For optimal performance when querying a large dataset in a NoSQL database, it is crucial to have proper ______ in place.

  • Indexing
  • Sharding
  • Versioning
  • Compression
For optimal performance when querying a large dataset in a NoSQL database, it is crucial to have proper indexing in place. Indexes help the database quickly locate and retrieve the data, reducing query response times.
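As a minimal sketch of what this looks like in practice, here is index creation with the MongoDB Node.js driver; the database, collection, and field names are hypothetical:

```js
// Minimal sketch using the MongoDB Node.js driver; the database,
// collection, and field names are hypothetical.
const { MongoClient } = require('mongodb');

async function main() {
  const client = new MongoClient('mongodb://localhost:27017');
  await client.connect();
  const users = client.db('app').collection('users');

  // Create an ascending index on the field used in frequent queries.
  await users.createIndex({ email: 1 });

  // This query can now use the index instead of scanning the collection.
  const user = await users.findOne({ email: 'alice@example.com' });
  console.log(user);

  await client.close();
}

main().catch(console.error);
```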

You are developing a web application that needs to make API requests to a server on a different domain. How would you handle CORS to ensure that your web application can interact with the server without any issues?

  • Set up a server-side proxy to forward requests to the other domain.
  • Enable CORS on the server by adding appropriate headers to allow cross-origin requests.
  • Use JSONP for making cross-domain requests.
  • Disable CORS restrictions in the browser settings.
The correct approach is to enable CORS on the server by adding the appropriate headers that allow cross-origin requests. A server-side proxy can work as a workaround, but it is not the best practice; JSONP is an outdated technique, and disabling CORS restrictions in the browser is not a recommended security practice.
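A minimal sketch of enabling CORS in an Express server by setting the headers manually; the allowed origin shown here is a placeholder:

```js
// Minimal sketch of enabling CORS in Express; the origin value is a placeholder.
const express = require('express');
const app = express();

app.use((req, res, next) => {
  res.setHeader('Access-Control-Allow-Origin', 'https://app.example.com');
  res.setHeader('Access-Control-Allow-Methods', 'GET, POST, PUT, DELETE');
  res.setHeader('Access-Control-Allow-Headers', 'Content-Type, Authorization');
  if (req.method === 'OPTIONS') {
    return res.sendStatus(204); // answer pre-flight requests immediately
  }
  next();
});

app.get('/api/data', (req, res) => res.json({ ok: true }));

app.listen(3000);
```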

You are tasked with creating an HTTP proxy server using the http module. What steps and considerations would you take to handle incoming requests and forward them to the appropriate destination?

  • Parse the incoming request, extract the destination URL, and create a new request to the destination server.
  • Directly pass the incoming request to the destination server without any parsing.
  • Block all incoming requests as proxy servers should only send responses.
  • Use the 'fs' module to read and forward the request to the destination.
To create an HTTP proxy server, you should parse the incoming request, extract the destination URL, and create a new request to the destination server. The other options are not suitable for proxy server implementations.
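A rough sketch of this approach using only the core http module; error handling is kept minimal and the destination is assumed to come from the request's Host header:

```js
// Minimal sketch of a forwarding proxy built on the core http module.
// It parses the incoming request, builds a new request to the destination
// host, and streams the response back to the client.
const http = require('http');

const proxy = http.createServer((clientReq, clientRes) => {
  // Assume the destination is given by the Host header plus the request path.
  const options = {
    hostname: clientReq.headers.host,
    port: 80,
    path: clientReq.url,
    method: clientReq.method,
    headers: clientReq.headers,
  };

  const upstream = http.request(options, (upstreamRes) => {
    clientRes.writeHead(upstreamRes.statusCode, upstreamRes.headers);
    upstreamRes.pipe(clientRes); // stream the response back to the client
  });

  upstream.on('error', (err) => {
    clientRes.writeHead(502);
    clientRes.end('Bad gateway: ' + err.message);
  });

  clientReq.pipe(upstream); // forward the request body
});

proxy.listen(8080);
```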

To optimize the serving of static files in Express.js, enabling ______ is a common practice.

  • Compression
  • Minification
  • Preloading
  • Caching
Enabling caching is a common practice in Express.js to optimize the serving of static files. Caching allows the server to store and reuse static files, reducing the need to fetch them from the disk or generate them on each request, thus improving performance.
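A minimal sketch of serving static files with caching headers in Express; the directory name and max age are illustrative:

```js
// Minimal sketch: serve static files with Cache-Control headers so browsers
// and intermediaries can reuse them. The directory name and max age are
// illustrative.
const express = require('express');
const app = express();

app.use(express.static('public', {
  maxAge: '1d',   // sets Cache-Control: max-age for served files
  etag: true,     // allows conditional requests via ETag validation
}));

app.listen(3000);
```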

How can prototype pollution vulnerabilities be mitigated in JavaScript applications?

  • Avoid using third-party libraries
  • Use strong typing for all variables
  • Validate and sanitize user input
  • Disable JavaScript prototypes
To mitigate prototype pollution vulnerabilities in JavaScript applications, it's crucial to validate and sanitize user input. This prevents malicious input from corrupting object prototypes. Avoiding third-party libraries and using strong typing are good practices but do not directly address prototype pollution. Disabling prototypes would break core JavaScript functionality.
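As a minimal sketch of this idea, a merge helper can reject dangerous keys before copying user-supplied data into application objects (the helper name and payload are illustrative):

```js
// Minimal sketch of sanitizing user-supplied keys before merging objects,
// so that keys like __proto__ cannot pollute Object.prototype.
const FORBIDDEN_KEYS = new Set(['__proto__', 'constructor', 'prototype']);

function safeMerge(target, source) {
  for (const key of Object.keys(source)) {
    if (FORBIDDEN_KEYS.has(key)) continue; // drop dangerous keys
    const value = source[key];
    if (value && typeof value === 'object' && !Array.isArray(value)) {
      target[key] = safeMerge(target[key] ?? {}, value);
    } else {
      target[key] = value;
    }
  }
  return target;
}

// A malicious payload attempting to pollute the prototype is ignored.
const merged = safeMerge({}, JSON.parse('{"__proto__": {"isAdmin": true}}'));
console.log({}.isAdmin); // undefined
```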

How can you ensure that a specific version of npm is used in your Node.js project?

  • By defining the npm version in the package.json file.
  • By modifying the NODE_PATH environment variable.
  • By running the 'npm use' command.
  • By using a package manager like Yarn instead of npm.
You can ensure a specific version of npm is used in your Node.js project by defining the desired npm version in the engines field of your project's package.json file. This helps maintain consistency and ensures that others working on the project use the correct npm version.
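A minimal sketch of the relevant package.json fragment (the version ranges are illustrative); note that npm treats the engines field as advisory unless engine-strict is enabled in the project's .npmrc:

```json
{
  "name": "my-app",
  "version": "1.0.0",
  "engines": {
    "node": ">=18.0.0",
    "npm": "9.x"
  }
}
```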

You are developing a RESTful API using the http module and need to support CORS. How would you implement CORS headers to handle pre-flight requests in your HTTP server?

  • Access-Control-Allow-Origin: *
  • Access-Control-Allow-Origin: mydomain.com
  • Access-Control-Allow-Origin: true
  • Access-Control-Allow-Origin: false
To support CORS in your HTTP server, you should set the Access-Control-Allow-Origin header to the specific domain (e.g., mydomain.com) from which you want to allow requests; for pre-flight (OPTIONS) requests, the server also responds with Access-Control-Allow-Methods and Access-Control-Allow-Headers. Using a wildcard (*) is less secure because it allows any origin, and true or false are not valid values for this header.
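A minimal sketch of handling pre-flight requests with the core http module; the allowed origin is a placeholder value:

```js
// Minimal sketch of handling CORS pre-flight (OPTIONS) requests with the
// core http module; the allowed origin is a placeholder value.
const http = require('http');

const server = http.createServer((req, res) => {
  res.setHeader('Access-Control-Allow-Origin', 'https://mydomain.com');

  if (req.method === 'OPTIONS') {
    // Pre-flight: advertise the methods and headers the API accepts.
    res.setHeader('Access-Control-Allow-Methods', 'GET, POST, PUT, DELETE');
    res.setHeader('Access-Control-Allow-Headers', 'Content-Type, Authorization');
    res.setHeader('Access-Control-Max-Age', '86400');
    res.writeHead(204);
    return res.end();
  }

  res.writeHead(200, { 'Content-Type': 'application/json' });
  res.end(JSON.stringify({ ok: true }));
});

server.listen(3000);
```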

You are developing a media hosting platform where users can upload images and videos. How would you design the file storage and retrieval system to ensure high availability and low latency for users across the globe?

  • Use a content delivery network (CDN) to cache and distribute media files to edge servers around the world.
  • Store all media files on a single, centralized server to simplify management.
  • Use a local file storage system for each region separately.
  • Implement a peer-to-peer (P2P) system for users to share media directly.
To ensure high availability and low latency, using a CDN is the recommended approach. CDNs cache content on edge servers located across the globe, reducing the distance between users and the data they request. The other options may lead to latency issues and lower availability.

A common practice for improving error handling in Express.js is to centralize error handling using a ______.

  • try-catch block
  • catchError middleware
  • globalErrorHandler middleware
  • throw statement
A common practice for improving error handling in Express.js is to centralize error handling using a globalErrorHandler middleware. This middleware is responsible for handling errors that occur during request processing. While try-catch blocks can be used within individual routes, they don't centralize error handling; catchError is not a standard Express construct, and throw is a language statement, not middleware.
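A minimal sketch of a centralized error handler in Express; the four-argument signature is what marks a middleware as an error handler, and the findUser lookup below is a hypothetical helper:

```js
// Minimal sketch of centralized error handling in Express.
const express = require('express');
const app = express();

app.get('/users/:id', async (req, res, next) => {
  try {
    const user = await findUser(req.params.id); // hypothetical lookup
    res.json(user);
  } catch (err) {
    next(err); // forward to the centralized error handler
  }
});

// Error-handling middleware, registered after all routes.
app.use((err, req, res, next) => {
  console.error(err);
  res.status(err.status || 500).json({ error: err.message || 'Internal Server Error' });
});

app.listen(3000);
```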

You are tasked with developing a system to read and process large files without consuming a lot of memory. How would you utilize streams in Node.js to efficiently read, process, and write the data?

  • Use a Readable stream to read the file in chunks, process data asynchronously, and then write to a Writable stream.
  • Read the entire file into memory, process it, and then write it back to disk.
  • Use a synchronous file reading approach to minimize memory usage.
  • Use a single callback function to read and process the entire file sequentially.
To efficiently process large files, you should use a Readable stream to read the file in chunks, process data asynchronously, and then write to a Writable stream. This approach minimizes memory usage and is ideal for handling large files in Node.js.
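A minimal sketch of this pattern using core Node.js streams; the file names and the uppercase transformation are illustrative:

```js
// Minimal sketch of stream-based processing: read a large file in chunks,
// transform each chunk, and write the result without buffering the whole
// file in memory. File names are illustrative.
const fs = require('fs');
const { Transform, pipeline } = require('stream');

const upperCase = new Transform({
  transform(chunk, encoding, callback) {
    // Process each chunk as it arrives instead of loading the entire file.
    callback(null, chunk.toString().toUpperCase());
  },
});

pipeline(
  fs.createReadStream('large-input.txt'),
  upperCase,
  fs.createWriteStream('output.txt'),
  (err) => {
    if (err) console.error('Pipeline failed:', err);
    else console.log('Processing complete.');
  }
);
```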