5 Groups of Interesting AWS Launches in re:Invent 2018

  • by Emre Yilmaz
  • Dec 11, 2018
  • AWS
  • Istanbul

The major yearly AWS event, re:Invent 2018, took place two weeks ago. As always, exciting new launches were announced across many fields, from serverless to machine learning and even blockchain. In this post, I will talk about 5 groups of launches that I find most interesting and look forward to exploring further.

1 - Amazon Managed QLDB and Amazon Managed Blockchain

In my opinion, the biggest AWS launches of re:Invent 2018 are the new managed blockchain services. As you might have noticed, blockchain has been on the rise in recent years and is being applied to more areas day by day. Of course, AWS could not stay indifferent to this popularity, as there are already many open source blockchain deployments on its platform, and it announced two new blockchain-related services: Amazon Quantum Ledger Database (QLDB) and Amazon Managed Blockchain.

Amazon QLDB is a fully managed, immutable, transparent, and cryptographically verifiable ledger database. It is serverless and cost-effective. It is centralized, and although this seems to contradict the ideas behind the creation of blockchain, it can help you a lot if you are building a ledger-like application and need to maintain an accurate history of application data as a central authority. If you aim to benefit from the immutable and transparent structure of ledger databases, this service can save you from setting up and running this type of database yourself.

On the other hand, Amazon Managed Blockchain might be what you are looking for if you would like to create and manage scalable blockchain networks using the Hyperledger Fabric or Ethereum frameworks in a decentralized manner. It saves you the burden of manually provisioning hardware, configuring software, and setting up networking and security when building a self-hosted blockchain network. It scales automatically as new members join or leave. It can also be integrated with Amazon QLDB to store a copy of the network activity outside the network for further analysis and insights.

2 - AWS Lambda Support for Ruby, Custom Lambda Runtimes and Lambda Layers

As a long-time Ruby developer, I was expecting Ruby to be accepted as a supported language in AWS Lambda, and while I was watching re:Invent 2018 from my MacBook, I was saying to myself: “They launched Go support last year, so maybe they will announce Ruby this year, right?”. Yes, it happened a few minutes later, and Ruby is now supported by AWS Lambda. It will definitely be easier to migrate some Rails applications to serverless, if not all of them.

Besides, I think AWS aimed to address the continual requests from developers like me for new languages by announcing the Lambda Runtime API. It lets you include an executable file named bootstrap that communicates between your code and the Lambda environment to execute your function. It can embed an interpreter for a programming language, and AWS has made C++ and Rust runtimes available to be added as Lambda layers. Let’s discuss what Lambda layers are, now.
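
To make the bootstrap idea concrete, here is a rough sketch of the core Runtime API loop in Python. The endpoint paths and the request-id header come from the Runtime API; the `handle` function is a hypothetical stand-in for loading and running your actual function code.

```python
import json
import os
import urllib.request

# Lambda provides the Runtime API host via this environment variable;
# a placeholder default lets the helpers below be exercised locally.
RUNTIME_API = os.environ.get("AWS_LAMBDA_RUNTIME_API", "127.0.0.1:9001")
API_BASE = f"http://{RUNTIME_API}/2018-06-01/runtime"

def next_invocation_url():
    # Endpoint the bootstrap polls for the next event to process.
    return f"{API_BASE}/invocation/next"

def response_url(request_id):
    # Endpoint the bootstrap posts the handler's result back to.
    return f"{API_BASE}/invocation/{request_id}/response"

def handle(event):
    # Hypothetical handler; a real bootstrap would load the user's function.
    return {"echo": event}

def main():
    while True:
        # 1. Fetch the next event (this call blocks until one arrives).
        with urllib.request.urlopen(next_invocation_url()) as resp:
            request_id = resp.headers["Lambda-Runtime-Aws-Request-Id"]
            event = json.loads(resp.read())
        # 2. Run the handler and post its result back.
        body = json.dumps(handle(event)).encode()
        req = urllib.request.Request(response_url(request_id), data=body, method="POST")
        urllib.request.urlopen(req)

if __name__ == "__main__":
    main()
```

Packaged as an executable `bootstrap` file, this loop is what turns any language with an HTTP client into a Lambda runtime.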

If you have been developing AWS Lambda functions for a while, you might have seen that it was not possible to share code between AWS Lambda functions without copying and packaging the shared code alongside every function that needed it. For example, if you used Python’s pymysql library to connect to a MySQL database, you had to include this library in every AWS Lambda package. Although the AWS Serverless Application Model and the Serverless Framework made this process easier through automation, this is what they did behind the scenes. Now, AWS Lambda Layers solves this problem. You can package your shared code in a zip file and upload it as a Lambda layer. Then you can reference this layer in your AWS Lambda functions’ configurations and use the shared code without bundling it into every deployment package. I believe this will be the standard way of sharing code between AWS Lambda functions from now on.
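
As a sketch of how a layer might be packaged: for Python functions, the shared code goes under a `python/` prefix in the zip so Lambda can place it on the function’s import path. The module name below is made up, and the publish call is only shown commented out since it needs AWS credentials.

```python
import io
import zipfile

def build_layer_zip(files):
    """Package shared Python modules under the python/ prefix, which is
    where Lambda unpacks a layer onto the function's import path."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        for name, source in files.items():
            zf.writestr(f"python/{name}", source)
    return buf.getvalue()

# Hypothetical shared module used by several functions.
layer_zip = build_layer_zip({"shared_db.py": "def connect():\n    pass\n"})

# Publishing the layer (sketched, not executed here):
# import boto3
# boto3.client("lambda").publish_layer_version(
#     LayerName="shared-db",
#     Content={"ZipFile": layer_zip},
#     CompatibleRuntimes=["python3.7"],
# )
```

Functions that reference the layer can then simply `import shared_db` without bundling it themselves.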

3 - Amazon Managed Streaming for Kafka

In the past, while talking to my clients, I saw that some of them had set up an Apache Kafka cluster on AWS to stream large files. I remember asking them how they maintained this setup in production, because I know that managing a structure like this requires time and effort.

However, provisioning and maintaining an Apache Kafka cluster will no longer be an issue on AWS, because AWS announced Amazon Managed Streaming for Kafka (MSK) at re:Invent 2018. It is a fully managed service that lets you create highly available (multi-AZ) Apache Kafka clusters with a few clicks, making it easier to develop applications that use Apache Kafka without installing and maintaining a cluster yourself. You only develop the producers and consumers for your Apache Kafka stream and use this service as the Kafka streaming resource. It will definitely be helpful for this type of application.
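
For illustration, with a client library such as kafka-python, the producer and consumer sides of such an application might be sketched like this. The broker addresses and topic name are placeholders (the real addresses come from the cluster’s bootstrap-brokers output), so the client calls are shown commented out; only the serialization helpers run as-is.

```python
import json

def serialize(record):
    # Encode a record as JSON bytes for the Kafka topic.
    return json.dumps(record).encode("utf-8")

def deserialize(raw):
    # Decode bytes consumed from the topic back into a dict.
    return json.loads(raw.decode("utf-8"))

# Against an MSK cluster, a producer/consumer pair would look roughly like:
#
# from kafka import KafkaProducer, KafkaConsumer
# BROKERS = ["b-1.example.kafka.amazonaws.com:9092"]  # placeholder
#
# producer = KafkaProducer(bootstrap_servers=BROKERS, value_serializer=serialize)
# producer.send("uploads", {"file": "report.csv", "size": 1024})
#
# consumer = KafkaConsumer("uploads", bootstrap_servers=BROKERS,
#                          value_deserializer=deserialize)
# for message in consumer:
#     print(message.value)
```

The point is that this application code is all you write; the cluster behind `BROKERS` is MSK’s problem.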

4 - Amazon DynamoDB On-Demand and DynamoDB Transactions

Amazon DynamoDB is a low-latency, scalable, serverless NoSQL database which makes administering databases in production easier for developers. You can build multi-master, multi-region global databases on Amazon DynamoDB, and if your access patterns suit it, I really recommend considering DynamoDB as a database for your applications.

In the past, we could only provision read and write capacity units and had to increase or decrease this capacity manually or with custom automations. Then AWS announced DynamoDB auto scaling, with which you can set upper and lower limits, and it became a best practice. Now AWS has gone a step further with Amazon DynamoDB On-Demand, for cases when you do not know what usage patterns to expect. You simply set up the table without provisioning any capacity, and DynamoDB handles the rest by scaling in and out itself according to the load. Note, however, that it is not included in the free tier.
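
As a sketch, an on-demand table definition simply declares a billing mode instead of provisioned throughput. The table and key names below are made up, and the actual API call is shown commented out since it needs AWS credentials.

```python
def on_demand_table_params(table_name):
    # Table definition using on-demand billing: BillingMode replaces the
    # ProvisionedThroughput block, so no capacity units are specified.
    return {
        "TableName": table_name,
        "AttributeDefinitions": [{"AttributeName": "id", "AttributeType": "S"}],
        "KeySchema": [{"AttributeName": "id", "KeyType": "HASH"}],
        "BillingMode": "PAY_PER_REQUEST",
    }

# Creating the table (sketched):
# import boto3
# boto3.client("dynamodb").create_table(**on_demand_table_params("orders"))
```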

Another difficulty with Amazon DynamoDB was that it did not suit applications that need to handle programming logic in a transactional manner. If you put a record and needed to roll it back later, you had to issue another update yourself to revert it to its old state. However, DynamoDB Transactions now provides atomicity, consistency, isolation, and durability (ACID) across one or more tables within a single AWS account and region. This will allow you to develop more complex logic on DynamoDB where operations must be coordinated together. A good example of this is programming financial transactions.
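
A hedged sketch of such a financial transfer as a single transaction might look like the following; the table, key, and attribute names are hypothetical, and the actual call is commented out since it needs AWS credentials. Both writes succeed or fail together, and the condition expression rejects the whole transaction if the source balance is insufficient.

```python
def transfer_items(table, from_id, to_id, amount):
    # Two conditional writes forming one atomic transaction: debit one
    # account only if it has enough balance, and credit the other.
    return [
        {
            "Update": {
                "TableName": table,
                "Key": {"id": {"S": from_id}},
                "UpdateExpression": "SET balance = balance - :a",
                "ConditionExpression": "balance >= :a",
                "ExpressionAttributeValues": {":a": {"N": str(amount)}},
            }
        },
        {
            "Update": {
                "TableName": table,
                "Key": {"id": {"S": to_id}},
                "UpdateExpression": "SET balance = balance + :a",
                "ExpressionAttributeValues": {":a": {"N": str(amount)}},
            }
        },
    ]

# Executing the transaction (sketched):
# import boto3
# boto3.client("dynamodb").transact_write_items(
#     TransactItems=transfer_items("accounts", "alice", "bob", 25)
# )
```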

5 - Amazon Personalize and Amazon Textract

There were lots of new launches in the machine learning field. But I would like to share the two that made me most curious: Amazon Personalize and Amazon Textract.

Amazon Personalize makes developing personalized recommendation engines easier for developers who have minimal machine learning experience. You provide your dataset in Amazon S3 or as a data stream using AWS Amplify. The requirement is that it contains the Amazon Personalize-reserved keywords for user IDs and similar fields. Afterwards, you select a recommendation recipe while configuring Amazon Personalize, and you can activate its new AutoML feature, which is advertised as automatically searching for the optimal recipe. Then you have a recommendation engine with a few clicks. It is in preview mode, hence I could not try it yet. But it sounds useful for building recommendation engines for websites and mobile applications.

Amazon Textract is a fully managed AWS service for extracting text and data from scanned documents. It is like OCR in the cloud. I am aware that there are OCR solutions on the web that are used effectively. But it will be good to have a solution for creating programmatic document-processing workflows in an automated, scalable, and cost-effective way. Besides, AWS says that it is more than OCR: it also understands the contents of fields in forms and tables.
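
As a sketch, one step of such a workflow could send a scanned page to Textract and keep only the detected lines of text. The file name is a placeholder and the service call is shown commented out since it needs AWS credentials; the response-parsing helper runs against any response-shaped dict.

```python
def extract_lines(textract_response):
    # Pull just the detected LINE blocks out of a Textract
    # DetectDocumentText response.
    return [
        block["Text"]
        for block in textract_response.get("Blocks", [])
        if block["BlockType"] == "LINE"
    ]

# Calling the service on a scanned page (sketched):
# import boto3
# with open("invoice.png", "rb") as f:
#     response = boto3.client("textract").detect_document_text(
#         Document={"Bytes": f.read()}
#     )
# print(extract_lines(response))
```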


There were lots of new launches at re:Invent 2018, and you might find others more appealing depending on your use case. However, I find these 5 groups of launches the most interesting and would like to try them in the near future.

I hope you enjoyed the post.

Thanks for reading!



CEO @ Shikisoft

AWS Certified Solutions Architect & DevOps Engineer - Professional