Vapi Prompt Injection - user data
**Here we go again - TL;DR:**
You can inject user (and system) messages into the transcript as you confirm data (silently), so they are picked up by the analysis but not heard by the caller.
  • This is a follow-up to an earlier message about injecting system-level data using Make; you can also inject 'user' messages, which have their own uses, as outlined below.
Something else worth noting: you can send messages back to the control URL while you are in the call, and you can send 'user' messages as well as 'system' ones.
This is useful for getting data into the chat without it being spoken - it still shows up in the transcript, so the LLM analysis picks it up.
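To make this concrete, here is a minimal Python sketch of posting a silent 'user' message to a live call's control URL. The `add-message` payload shape and the `triggerResponseEnabled` flag are assumptions based on Vapi's call-control documentation - verify the exact field names for your API version.

```python
import json
import urllib.request

def build_injected_message(role: str, content: str) -> dict:
    """Build an assumed Vapi control payload that adds a message to the
    transcript without it being spoken; triggerResponseEnabled=False is
    intended to stop the assistant replying to the injected message."""
    return {
        "type": "add-message",
        "message": {"role": role, "content": content},
        "triggerResponseEnabled": False,
    }

def inject_message(control_url: str, role: str, content: str) -> None:
    """POST the payload to the live call's control URL."""
    data = json.dumps(build_injected_message(role, content)).encode()
    req = urllib.request.Request(
        control_url,
        data=data,
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)
```

The message lands in the transcript only, so post-call analysis can see it while the caller hears nothing.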
An example: I have a solution that asks the user to confirm a machine is theirs, but I need the serial number for a later process and I don't want the AI to read back A11F-123-X23.
I have the serial number (multiple machines' data is injected up front) sent into the system prompt when the call starts, but for whatever reason system message injections don't appear in the transcript flow (so they are not analysed).
So to get round this, create a tool so that when the machine is confirmed, you fire off a tool call that injects a 'user' message via the control URL and returns an empty webhook 200 response. There we have it - if you want a mini Make vid, let me know.
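A sketch of the tool-side logic under the same assumptions: the webhook receives the tool call, builds the control-URL payload carrying the serial, and returns an empty body with HTTP 200 so the tool doesn't fail. The `machineId` argument and the serial lookup are hypothetical stand-ins for however your flow identifies the confirmed machine.

```python
def handle_confirm_tool(tool_args: dict, serials: dict) -> tuple[dict, dict]:
    """Given the tool-call arguments (machine id assumed under
    'machineId') and a machine->serial lookup table, return
    (payload_to_post_to_control_url, empty_webhook_response_body)."""
    serial = serials.get(tool_args.get("machineId", ""), "unknown")
    control_payload = {
        "type": "add-message",  # assumed Vapi control message type
        "message": {
            "role": "user",
            "content": f"Confirmed machine serial: {serial}",
        },
        "triggerResponseEnabled": False,  # don't let the AI respond aloud
    }
    # Empty dict + a 200 status from your server = the tool call succeeds
    # without the assistant saying anything about it.
    return control_payload, {}
```

Your actual webhook framework (Make, Flask, whatever) just needs to POST the first dict to the call's control URL and send the second back as the 200 response.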
**Extra**
Theoretically, and this goes a bit deep, you could create tool calls for both 'user' and 'system' messages that are 'async' (depending on your use case), each with an endpoint that produces a 200 response (so the tool doesn't fail when called), then use these to do all sorts of stuff in the active call... anyone who gets why you might want to will get it - but be careful.
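For that 'extra' idea, here is a hedged sketch of what such an async tool definition might look like as a Python dict. The field names (`async`, the `function` schema, `server.url`) are assumptions modelled on common function-tool schemas - check Vapi's tool documentation before relying on them, and the endpoint URL is a placeholder.

```python
# Hypothetical tool definition: an async tool returns immediately, so the
# call flow never blocks on it; the server endpoint just answers 200.
INJECT_USER_TOOL = {
    "type": "function",
    "async": True,  # assumed flag: fire-and-forget during the live call
    "function": {
        "name": "inject_user_message",
        "description": "Silently add a 'user' message to the transcript.",
        "parameters": {
            "type": "object",
            "properties": {"content": {"type": "string"}},
            "required": ["content"],
        },
    },
    # Placeholder endpoint; it only needs to return an empty 200.
    "server": {"url": "https://example.com/hooks/inject"},
}
```

A matching 'system' variant would be identical apart from the name and which role the endpoint injects.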
Stuart Edwards