Scenario: You need to implement a serverless architecture where incoming data from IoT devices triggers AWS Lambda functions for processing. How would you design the integration between SNS and AWS Lambda in this scenario?

  • Deploy EC2 instances
  • Publish data to SNS topic
  • Use S3 for data storage
  • Utilize Kinesis Data Streams
Publishing the device data to an SNS topic and subscribing the Lambda function to that topic lets SNS invoke the function automatically to process each incoming message from the IoT devices.
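
As a rough sketch, the subscribed Lambda function receives each published message inside an SNS event record; the payload fields (device_id, temperature) below are assumptions about what the devices publish, not part of the original scenario.

```python
import json

def lambda_handler(event, context):
    """Process IoT readings delivered through an SNS topic subscription."""
    for record in event["Records"]:
        # SNS delivers the published message body as a string.
        message = json.loads(record["Sns"]["Message"])
        device_id = message.get("device_id")        # assumed payload field
        temperature = message.get("temperature")    # assumed payload field
        print(f"Device {device_id} reported temperature {temperature}")
    return {"processed": len(event["Records"])}
```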

What is DynamoDB Streams primarily used for?

  • Automating backups
  • Capturing data modification events
  • Ensuring high availability
  • Managing database schema
DynamoDB Streams captures data modification events in a DynamoDB table, allowing you to track changes and trigger actions based on those changes.
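
As a minimal sketch, a Lambda function wired to the stream receives these modification events in batches; the print statements stand in for whatever downstream action you would trigger.

```python
def lambda_handler(event, context):
    """React to data modification events captured by DynamoDB Streams."""
    for record in event["Records"]:
        event_name = record["eventName"]        # INSERT, MODIFY, or REMOVE
        keys = record["dynamodb"]["Keys"]       # primary key of the changed item
        if event_name == "INSERT":
            # NewImage is present only if the stream view type includes new images.
            new_image = record["dynamodb"].get("NewImage", {})
            print(f"Item created: {keys} -> {new_image}")
        elif event_name == "MODIFY":
            print(f"Item updated: {keys}")
        else:  # REMOVE
            print(f"Item deleted: {keys}")
```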

How does DynamoDB Streams ensure data durability?

  • By creating backups
  • By replicating data across multiple Availability Zones
  • By storing data in memory
  • By writing data to disk
DynamoDB Streams ensures data durability by automatically replicating stream data across multiple Availability Zones within a region, preventing data loss.

In DynamoDB Streams, what triggers the generation of stream records?

  • Data modifications (create, update, delete)
  • Read operations
  • Schema changes
  • Table scans
Stream records in DynamoDB Streams are generated when data modifications such as create, update, and delete operations occur in the table.
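
Stream records are only generated once a stream is enabled on the table. A hedged boto3 sketch (the table name is illustrative):

```python
import boto3

dynamodb = boto3.client("dynamodb")

# Enable a stream on an existing table; NEW_AND_OLD_IMAGES puts both the
# before and after images of each modified item into the stream records.
response = dynamodb.update_table(
    TableName="IoTReadings",
    StreamSpecification={
        "StreamEnabled": True,
        "StreamViewType": "NEW_AND_OLD_IMAGES",
    },
)
print(response["TableDescription"]["LatestStreamArn"])
```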

How long does DynamoDB Streams retain records by default?

  • 24 hours
  • 30 days
  • 48 hours
  • 7 days
DynamoDB Streams retains records for 24 hours; after that they are automatically removed and can no longer be read from the stream.

What is the purpose of a DynamoDB stream ARN (Amazon Resource Name)?

  • Creating backup snapshots
  • Granting IAM permissions
  • Identifying a specific stream
  • Monitoring stream activity
A DynamoDB stream ARN uniquely identifies a specific stream; it embeds the table name and a timestamp-based stream label, and it is the value you reference when connecting the stream to consumers such as AWS Lambda.
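
One way to retrieve the ARN, sketched with boto3 (the table name and the ARN shown in the comment are illustrative):

```python
import boto3

dynamodb = boto3.client("dynamodb")

# Look up the ARN of the table's most recently enabled stream.
table = dynamodb.describe_table(TableName="IoTReadings")["Table"]
stream_arn = table["LatestStreamArn"]

# The ARN embeds the table name and a timestamp-based stream label, e.g.
# arn:aws:dynamodb:us-east-1:123456789012:table/IoTReadings/stream/2024-05-01T00:00:00.000
print(stream_arn)
```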

How can you ensure ordered processing of records in DynamoDB Streams?

  • Enable cross-region replication
  • Implement conditional writes
  • Increase read capacity units
  • Use partition keys
All modifications to items with the same partition key land in the same shard in write order, and Lambda processes each shard sequentially, so records for a given key are handled in the order the changes occurred.
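
A sketch of wiring the stream to a Lambda function with an event source mapping (the stream ARN and function name are illustrative); the sequential, per-shard reads are what preserve per-key ordering:

```python
import boto3

lambda_client = boto3.client("lambda")

# Records that share a partition key appear in the same shard in write order,
# and Lambda invokes the function for each shard's batches sequentially.
lambda_client.create_event_source_mapping(
    EventSourceArn="arn:aws:dynamodb:us-east-1:123456789012:table/IoTReadings/stream/2024-05-01T00:00:00.000",
    FunctionName="process-iot-changes",
    StartingPosition="TRIM_HORIZON",
    BatchSize=100,
)
```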

What are some use cases for integrating DynamoDB Streams with AWS Lambda?

  • Load balancing
  • Long-term data storage
  • Real-time analytics
  • Static website hosting
Real-time analytics is a key use case for integrating DynamoDB Streams with AWS Lambda: as items change, the stream invokes the function, which can aggregate, transform, or publish metrics on those changes within seconds.
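
A minimal real-time analytics sketch: a stream-triggered function counts new items in each batch and publishes the count as a CloudWatch metric (the namespace and metric name are assumptions):

```python
import boto3

cloudwatch = boto3.client("cloudwatch")

def lambda_handler(event, context):
    """Publish a simple real-time metric for new items written to the table."""
    inserts = sum(1 for r in event["Records"] if r["eventName"] == "INSERT")
    if inserts:
        cloudwatch.put_metric_data(
            Namespace="IoTPipeline",                 # illustrative namespace
            MetricData=[{
                "MetricName": "NewReadings",         # illustrative metric name
                "Value": inserts,
                "Unit": "Count",
            }],
        )
```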

How does DynamoDB Streams handle data consistency across multiple shards?

  • Batch processing
  • Parallel processing
  • Sequence numbers
  • Timestamps
DynamoDB Streams assigns each record a sequence number; within a shard, sequence numbers reflect the order in which the modifications occurred, which lets consumers process changes consistently even when a stream is split across multiple shards.
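
Sequence numbers are visible when reading a stream directly with the low-level API; a sketch that reads the first shard (the stream ARN is illustrative):

```python
import boto3

streams = boto3.client("dynamodbstreams")
stream_arn = "arn:aws:dynamodb:us-east-1:123456789012:table/IoTReadings/stream/2024-05-01T00:00:00.000"

# List the stream's shards and read records from the first one.
shards = streams.describe_stream(StreamArn=stream_arn)["StreamDescription"]["Shards"]
iterator = streams.get_shard_iterator(
    StreamArn=stream_arn,
    ShardId=shards[0]["ShardId"],
    ShardIteratorType="TRIM_HORIZON",
)["ShardIterator"]

for record in streams.get_records(ShardIterator=iterator)["Records"]:
    # SequenceNumber reflects the order of modifications within the shard.
    print(record["dynamodb"]["SequenceNumber"], record["eventName"])
```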

What are the limitations of DynamoDB Streams regarding scalability and performance?

  • High latency
  • Lack of data encryption
  • Limited read throughput
  • Limited write capacity
Read throughput on a stream is limited: AWS recommends that no more than two consumer processes read from the same shard at a time, which can constrain scalability and performance when processing high volumes of changes.
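
One common way to work within these limits is to tune the Lambda event source mapping rather than add extra stream consumers; a hedged sketch, assuming an existing mapping whose UUID is illustrative:

```python
import boto3

lambda_client = boto3.client("lambda")

# Larger batches reduce the number of reads per shard, and a
# ParallelizationFactor above 1 allows up to 10 concurrent invocations
# to process a single shard while still preserving per-key ordering.
lambda_client.update_event_source_mapping(
    UUID="11111111-2222-3333-4444-555555555555",   # illustrative mapping ID
    BatchSize=500,
    ParallelizationFactor=4,
)
```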