Why AWS isn't adding new languages to CodeWhisperer just yet
AWS' GM for CodeWhisperer identified language parity as a key focus
AWS has targeted greater precision for its CodeWhisperer service and aims to ensure code suggestions are of equal quality regardless of language.
Customers on the new CodeWhisperer Enterprise tier will be able to create a ‘customization’ for CodeWhisperer, which connects the service to a repository such as GitLab or GitHub via AWS’ CodeStar Connections API, or directly to an Amazon Simple Storage Service (S3) bucket.
To focus on a specific section of code rather than everything in a repository, customers can place that code in a dedicated S3 bucket and point CodeWhisperer directly at it.
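For illustration only, the short boto3 sketch below shows how those two source options might be set up; the connection and bucket names are hypothetical, the CodeWhisperer customization itself is then created and pointed at these sources through the AWS console rather than by this code, and the snippet assumes AWS credentials are already configured.

import boto3

codestar = boto3.client("codestar-connections", region_name="us-east-1")
s3 = boto3.client("s3", region_name="us-east-1")

# Option 1: create a CodeStar connection to a source provider such as GitHub.
# The connection still has to be authorised once in the AWS console before use.
connection = codestar.create_connection(
    ProviderType="GitHub",
    ConnectionName="codewhisperer-customization-src",  # hypothetical name
)
print("Connection ARN:", connection["ConnectionArn"])

# Option 2: stage only the code the customization should learn from
# in a dedicated S3 bucket and point CodeWhisperer at that instead.
s3.create_bucket(Bucket="my-team-codewhisperer-src")  # hypothetical name
s3.upload_file("internal_lib.zip", "my-team-codewhisperer-src", "internal_lib.zip")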
The new feature comes at a time of heightened innovation in the space of generative AI, with CodeWhisperer sharing a market with a wide range of other AI assistants such as Microsoft’s Copilot.
AWS' CodeWhisperer goals
Speaking to ITPro, Doug Seven, general manager of Amazon CodeWhisperer, said the primary strategy for the coding assistant right now is to continually improve its effectiveness with the 15 programming languages it knows.
“I'm really concerned about things like parity among the languages,” said Seven.
“The experience I have in Python, I want to be the same as the experience I have in Rust. And right now there's a disparity of those things in terms of what CodeWhisperer is able to do.”
Seven also highlighted the disparity in CodeWhisperer’s security scanning feature, which checks code for known vulnerabilities but only works for a portion of CodeWhisperer’s 15 languages.
“What’s fascinating about how these language models work is we could open the stable gates and let the language model generate code in any language. It's capable of doing it, we can let CodeWhisperer generate code in any language you want, but we restrict it because of our principles around responsible AI.”
One of AWS’ core concerns when it comes to using training data for CodeWhisperer suggestions is the extent to which licensed open-source code has been used to inform model output.
Seven identified this as another reason AWS is cautiously moving forward with extending language support, as it aims to give customers clear notice of the code they are using for security purposes and legal peace of mind.
The update announced last week was a big step forward for the service, and for Amazon Bedrock more broadly. But Seven said that outside of release calendars, updates for CodeWhisperer are subtle and frequent.
“If you were to compare CodeWhisperer in April to CodeWhisperer today, it actually makes fewer but more relevant suggestions.
“We added some capabilities to better predict when you might need a suggestion, and when you're most open to getting the code suggestions. So we can make fewer suggestions to you, but make sure that those suggestions are much more relevant.
AWS developers have also changed the way CodeWhisperer works with specific programming languages over this period.
“If you go back two weeks, if you were coding in Go or Rust, or you were writing SQL commands, you would have gotten single-line code suggestions. We would have been giving you one line at a time.
“Today, you'll get function block suggestions in five more languages than you would have a week ago.”
CodeWhisperer language capabilities
Of the 15 programming languages in which CodeWhisperer can generate code, customization will initially only be available for Python, Java, and JavaScript. Seven stated that in the immediate future, the service will also extend to TypeScript and C#.
AWS teased new AI features at the end of September as Amazon Bedrock reached general availability. The service provides customers with access to a range of generative AI models, including Amazon’s own Titan foundation models and those made by Anthropic.
The firm also promised to bring Meta’s open LLM Llama 2, which can be fine-tuned to compete with the likes of ChatGPT and even GPT-4 in certain tasks, to the platform in the coming weeks.
Asked if this meant Bedrock customers could soon have a choice between CodeWhisperer and Meta’s pair programmer offshoot of Llama 2, Code Llama, Seven indicated that AWS doesn’t see the two systems as competitors.
“Eventually, I'm sure, we'll have other models available that you can work with through APIs and eventually be able to do other things with them.
“It’s a little bit independent, CodeWhisperer is really more of an abstraction like, ‘Hey, you don't want to get into the weeds of that, we've done that for you, we're presenting an application that does these things’.”
“Over the course of time, the vision for CodeWhisperer is really a productivity tool that is enabled by AI. And so, we look at all of the challenges that developers face when writing code and try to figure out ways that we can use these tools and technologies we have to make that easier.”

Rory Bathgate is Features and Multimedia Editor at ITPro, overseeing all in-depth content and case studies. He can also be found co-hosting the ITPro Podcast with Jane McCallion, swapping a keyboard for a microphone to discuss the latest learnings with thought leaders from across the tech sector.
In his free time, Rory enjoys photography, video editing, and good science fiction. After graduating from the University of Kent with a BA in English and American Literature, Rory undertook an MA in Eighteenth-Century Studies at King’s College London. He joined ITPro in 2022 as a graduate, following four years in student journalism. You can contact Rory at rory.bathgate@futurenet.com or on LinkedIn.