Security Implications of AI-assisted Coding


GitHub quietly announced the technical preview of its new Copilot feature recently. Copilot is an AI-assisted pair programming tool that can be used in VS Code or in a GitHub codespace. It uses a machine learning model trained on all public GitHub source code to offer in-depth suggestions and code completion as you work. It isn’t the first product of its kind, since Microsoft already offers its own IntelliCode code completion tool, but what makes Copilot exciting is that it’s built on the very promising OpenAI Codex system. This is undoubtedly the beginning of what many programmers have been hoping for: automated coding! Developers will also be pleased that GitHub claims the tool can help write unit tests, because who wants to write those?

Copilot is definitely exciting. Look at what the tool comes up with from nothing more than the function name “send_tweet_with_image” in Python:
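To give a sense of what that kind of completion looks like, here is a minimal sketch of the sort of function body a tool like this might generate from that name alone. This is an illustration, not Copilot’s actual output; it assumes the Tweepy library and uses placeholder credentials:

```python
# Illustrative sketch only -- not Copilot's actual output.
import tweepy

def send_tweet_with_image(message, image_path):
    # Placeholder credentials; real code should load these from a secure source,
    # not hard-code them.
    auth = tweepy.OAuthHandler("CONSUMER_KEY", "CONSUMER_SECRET")
    auth.set_access_token("ACCESS_TOKEN", "ACCESS_TOKEN_SECRET")
    api = tweepy.API(auth)

    # Upload the image, then attach it to a new status update.
    media = api.media_upload(image_path)
    api.update_status(status=message, media_ids=[media.media_id])
```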