That friendly, ever-so-helpful AI coding assistant? You can’t trust it.
Most programmers now use AI coding assistants such as GitHub Copilot, ChatGPT, and Amazon Q Developer. In fact, according to a 2024 Stack Overflow survey, 76% of respondents already use or plan to use AI code assistants.
That may be a big mistake.
In an email interview, Craig McLuckie, a co-creator of Kubernetes and the founder and CEO of Stacklok, a software supply chain security company, told The New Stack, “Over the past weeks, I have watched AI coding assistants exfiltrate secrets to OpenAI, and I’ve seen various [large language models] recommend deprecated and dangerous (even hallucinated) packages that AI coding assistants then try to install.”
Yow!
It gets worse. “It gets doubly complicated because foreign adversaries have been busily publishing malicious packages with names that are commonly hallucinated,” McLuckie added.
To combat this problem, he said, Stacklok has launched a new open source project, CodeGate. Locally hosted (i.e., run by developers on their own machines), it is what he calls a “privacy-focused solution that acts as an essential layer of security within a developer’s generative AI workflow.”
How CodeGate Works
Specifically, CodeGate, licensed under Apache 2, acts as a local proxy between developers and AI coding assistants. Running in a dedicated Docker container, it keeps sensitive information protected while preserving the productivity benefits of AI assistance.
CodeGate does this by monitoring prompts for code secrets, such as API keys and credentials. It encrypts your secrets on the fly as your code goes back and forth between your workstation and the AI service.
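The idea of scrubbing a prompt before it leaves the machine can be illustrated with a minimal sketch. Note that the patterns, function name, and masking behavior here are hypothetical, for illustration only; CodeGate’s actual implementation encrypts secrets in transit and restores them on the way back, rather than simply masking them:

```python
import re

# Hypothetical patterns for common credential formats; a production
# tool would ship a much larger, curated rule set.
SECRET_PATTERNS = [
    re.compile(r"sk-[A-Za-z0-9]{20,}"),  # OpenAI-style API keys
    re.compile(r"AKIA[0-9A-Z]{16}"),     # AWS access key IDs
    re.compile(r"ghp_[A-Za-z0-9]{36}"),  # GitHub personal access tokens
]

def redact_secrets(prompt: str) -> str:
    """Replace anything that looks like a credential before the
    prompt is forwarded to a remote AI service."""
    for pattern in SECRET_PATTERNS:
        prompt = pattern.sub("#REDACTED#", prompt)
    return prompt

prompt = 'Why does my request fail with key="AKIAABCDEFGHIJKLMNOP"?'
print(redact_secrets(prompt))
# → Why does my request fail with key="#REDACTED#"?
```

Because the proxy sits between the editor and the AI provider, a check like this runs on every outbound request without the developer changing their workflow.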
This commitment to privacy is a standout feature. The tool operates entirely on your local machine, ensuring no data except the coding assistant’s required traffic leaves your system.
The program also blocks potentially harmful libraries and deprecated dependencies, checking them against a continuously updated database of known-risky packages and intervening when an AI tool suggests one. As McLuckie told TNS, “It alerts the developer whenever an LLM recommends an unsafe dependency, but otherwise sits quietly in the background.”
CodeGate currently supports integration with popular AI providers such as OpenAI and Anthropic, as well as tools like GitHub Copilot and continue.dev. The developers plan to expand compatibility by including more tools, such as the AI pair programming tool aider and the AI code editor Cursor.
As the software development landscape evolves with AI integration, tools like CodeGate will play a crucial role in balancing the benefits of AI assistance with the necessary safeguards for security and privacy. CodeGate’s open source code base invites collaboration and scrutiny from the developer community, which should help accelerate improvements and widespread adoption.
The post CodeGate: Open Source Tool Secures AI Coding Assistants appeared first on The New Stack.