Why It Matters for AI-Coded Apps
AI coding tools frequently generate string-concatenated SQL queries, especially in quick prototypes. Our scans found that 34% of vibe-coded apps with database access used at least one vulnerable query pattern. LLMs often prioritize readability over security when constructing database queries.
Real-World Example
Consider a login form that builds its query as SELECT * FROM users WHERE username = '" + username + "' AND password = '" + password + "'. An attacker enters ' OR '1'='1' -- as the username, which transforms the query into SELECT * FROM users WHERE username = '' OR '1'='1' --' AND password = ''. The '1'='1' condition is always true and the -- comments out the password check, bypassing authentication entirely.
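The bypass above can be reproduced end to end with Python's stdlib sqlite3 module. The table, user row, and login_vulnerable function are hypothetical stand-ins for the login form described; the concatenation pattern is the one from the example.

```python
import sqlite3

# In-memory database with one user (hypothetical schema for illustration).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (username TEXT, password TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def login_vulnerable(username, password):
    # DANGER: string concatenation — user input becomes part of the SQL text.
    query = ("SELECT * FROM users WHERE username = '" + username +
             "' AND password = '" + password + "'")
    return conn.execute(query).fetchall()

# Attacker supplies the payload from the example; the password can be anything,
# because -- comments out the rest of the WHERE clause.
rows = login_vulnerable("' OR '1'='1' --", "wrong-password")
print(rows)  # returns every user row — authentication bypassed
```

Running this prints the full user table even though the password is wrong, which is exactly the bypass the example describes.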
How to Detect and Prevent It
Always use parameterized queries or prepared statements. Use an ORM (Prisma, SQLAlchemy, ActiveRecord) which handles parameterization automatically. Never concatenate user input into SQL strings. Apply least-privilege database permissions so even a successful injection limits damage.
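The first recommendation above, parameterized queries, can be sketched with the same stdlib sqlite3 setup (table and function names are hypothetical). Placeholders send user input to the driver as data, so it can never change the query's structure.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (username TEXT, password TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def login_safe(username, password):
    # Placeholders (?) bind user input as values, never as SQL text.
    query = "SELECT * FROM users WHERE username = ? AND password = ?"
    return conn.execute(query, (username, password)).fetchall()

print(login_safe("' OR '1'='1' --", "anything"))  # [] — the payload is inert
print(login_safe("alice", "s3cret"))              # the legitimate row
```

The same injection payload now matches no rows: the database looks for a user literally named ' OR '1'='1' -- instead of executing it.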
Frequently Asked Questions
Can ORMs prevent SQL injection completely?
Mostly, but not completely. ORMs parameterize queries by default, but their raw-query escape hatches (such as Prisma's $queryRaw or SQLAlchemy's text()) can still be vulnerable if you concatenate user input into them. Always parameterize even raw queries.
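"Parameterize even raw queries" means raw SQL strings are fine as long as user input arrives through bind parameters rather than concatenation. A minimal sketch with stdlib sqlite3, using named :name placeholders (SQLAlchemy's text() accepts the same style with a params dict); the schema and find_user helper are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (username TEXT, password TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def find_user(username):
    # Raw SQL, but the user-supplied value is passed as a bind parameter.
    query = "SELECT * FROM users WHERE username = :u"
    return conn.execute(query, {"u": username}).fetchall()

print(find_user("' OR '1'='1' --"))  # [] — injection attempt treated as data
print(find_user("alice"))            # the matching row
```

The dangerous version would be f"... WHERE username = '{username}'"; the only difference is how the value reaches the query, and that difference is the entire vulnerability.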