Amazon updates development environment powering Alexa

New language and version controls boost bot development experience

Amazon has announced an update to Lex, its conversational artificial intelligence (AI) interface service for applications, to make it easier to build bots that support multiple languages.

Lex is the cloud-based service that powers Amazon’s Alexa speech-based virtual assistant. The company also offers it as a service for building virtual agents, conversational interactive voice response (IVR) systems, self-service chatbots, and informational bots.

Organizations define conversational flows using a management console that then produces a bot they can attach to various applications, like Facebook Messenger.

The company released its Version 2 enhancements and a collection of updates to the application programming interface (API) used to access the service.

One of the biggest V2 enhancements is expanded language support. Developers can now add multiple languages to a single bot and manage them collectively throughout the development and deployment process.

According to Martin Beeby, principal advocate for Amazon Web Services, developers can add new languages during development and switch between them to compare conversations.
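
To illustrate, the following is a minimal sketch, not an official example, of how a second locale might be added to an existing bot through the Lex V2 model-building API using the AWS SDK for Python (boto3); the bot ID is a placeholder and the confidence threshold is an assumed value.

```python
import boto3

# Lex V2 bots are managed through the "lexv2-models" client.
lex_models = boto3.client("lexv2-models")

# Add a Spanish locale to the bot's DRAFT version; each locale can then be
# developed, built and tested alongside the bot's existing languages.
lex_models.create_bot_locale(
    botId="EXAMPLEBOTID",              # placeholder bot ID
    botVersion="DRAFT",                # locales are edited on the draft version
    localeId="es_ES",
    nluIntentConfidenceThreshold=0.4,  # assumed threshold for intent matching
)

# Build the new locale so it can be tested in the console or via the runtime API.
lex_models.build_bot_locale(
    botId="EXAMPLEBOTID",
    botVersion="DRAFT",
    localeId="es_ES",
)
```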

The updated development tooling also simplifies version control, making it easier to track different versions of a bot. Previously, developers had to version a bot's underlying components individually; the new release lets them version at the bot level.
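
As a rough sketch of what bot-level versioning looks like in practice, the call below, using the same assumed boto3 client and placeholder bot ID as above, snapshots every locale of the draft bot into a single numbered version in one operation.

```python
# Create one numbered version covering all of the bot's locales at once,
# rather than versioning intents, slot types and locales individually.
lex_models.create_bot_version(
    botId="EXAMPLEBOTID",            # placeholder bot ID
    description="Snapshot with English and Spanish locales",
    botVersionLocaleSpecification={
        "en_US": {"sourceBotVersion": "DRAFT"},
        "es_ES": {"sourceBotVersion": "DRAFT"},
    },
)
```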

Lex also comes with new productivity features, including the ability to save partially completed bots and upload sample utterances in bulk. A new configuration workflow makes it clearer to developers where they are in setting up a bot, Beeby added.
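
As a hedged illustration of the bulk-utterance idea, the sketch below passes a list of sample utterances, here a hypothetical hard-coded list that could equally be read from a file, to a new intent in a single boto3 call.

```python
# Hypothetical utterances; in practice these might be loaded from a file.
utterances = [
    "I want to book a room",
    "Book a hotel",
    "Reserve a room for tonight",
]

# Attach all sample utterances to the intent in one API call.
lex_models.create_intent(
    intentName="BookHotel",          # hypothetical intent name
    botId="EXAMPLEBOTID",            # placeholder bot ID
    botVersion="DRAFT",
    localeId="en_US",
    sampleUtterances=[{"utterance": u} for u in utterances],
)
```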

Finally, Lex now features a streaming conversation API that can handle interruptions in the conversation flow. It can accommodate typical conversational speed bumps, such as a user pausing to think or asking to hold for a moment while looking up some information.
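
The streaming StartConversation operation is a bidirectional HTTP/2 API available only in a subset of the AWS SDKs, so as a simpler point of reference the sketch below uses the non-streaming runtime call to send a single user utterance to a deployed bot; the bot, alias and session identifiers are placeholders.

```python
# The Lex V2 runtime is a separate client from the model-building API.
lex_runtime = boto3.client("lexv2-runtime")

# Send one turn of a text conversation and print the bot's replies.
response = lex_runtime.recognize_text(
    botId="EXAMPLEBOTID",        # placeholder bot ID
    botAliasId="TSTALIASID",     # placeholder alias ID
    localeId="en_US",
    sessionId="demo-session-1",  # any client-chosen session identifier
    text="I'd like to book a hotel room",
)

for message in response.get("messages", []):
    print(message["content"])
```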
