Trying out Vercel’s Instant Deploy apps

Srishti Mishra
5 min read · Jan 12, 2025

AI tools for building out experiences have definitely gotten better in the last two years! While I’m comfortable deploying directly on cloud platforms like Azure/AWS, I wanted to dabble in the hype of the new ultra-smooth wrappers that deploy full apps in seconds and scale fast.

I tried out Vercel’s Chatbot template and used Cursor.ai for customization. I hit a couple of snags along the way and thought I’d write them up here in case anyone else gets stuck.

Vercel’s templates page🔗 has a bunch of cool ideas that you can deploy in a single click (and a v0 where you can literally chat, describe and deploy!). I chose the Next.js Chatbot, which connects to OpenAI models by default.

Single Click Deployment

Awesome and simple to get started, plus it creates a git repo. Then set up any dependencies, integrate storage (if needed) and find your API keys!

Prototyping

For any app, these three concepts help me fix and debug issues quickly. They’re helpful whether you have years of experience building apps or are just starting out.

  1. Storage & Environment Variables — Even for infra wrappers, these need to be set up correctly. They’re the values that tell an app how to ‘talk’ to or integrate with databases and other services. For me, the Vercel set-up page showed the required services (NeonDB for storage, etc.) but then failed to connect (perhaps an edge case or free-account limitation). If this happens, browse through the ‘Storage’ and ‘Settings’ tabs in the project to fix these manually.
  2. Logs and data flow — Add logs wherever you can. If you’re using frameworks like React or Next.js, use Cursor/Copilot/Perplexity to quickly understand the basic project setup, and use the ‘Logs’ tab to follow the flow (see the logging sketch after this list).
  3. For customization — keep track of hardcoded variables in the template projects. If you’re adding customizations, this is likely where things break. The good part is that the GitHub Issues for Vercel’s default templates may have already captured these (look through both open and closed issues).
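For point 2, here’s roughly what I mean by adding logs in the chat route. This is a sketch, not the template’s exact handler (the request fields like modelId are what I saw at the time and may differ in your version), but anything you console.log here shows up under the project’s ‘Logs’ tab:

```ts
// app/(chat)/api/chat/route.ts (sketch, simplified)
export async function POST(request: Request) {
  const body = await request.json();

  // These lines show up in Vercel's 'Logs' tab, so you can follow
  // exactly what the UI sent to the server.
  console.log('chat request: model =', body.modelId);
  console.log('chat request: message count =', body.messages?.length);

  // ... rest of the handler: call the model, stream the response back, etc.
  return new Response('ok');
}
```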

Storage

I could use the same storage blob across multiple projects and spin up a NeonDB instance for each project on their free plan.

Environment variables

The .env.example file in the GitHub repo lists the necessary environment variables, with instructions on generating them.

They can also be added under the Settings tab in the web interface.
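Since a missing variable here tends to surface as a confusing runtime error later, I like a quick fail-fast check. A minimal sketch (the variable names are the usual ones from the template’s .env.example; adjust to whatever yours lists):

```ts
// lib/check-env.ts (hypothetical helper, not part of the template)
const required = ['AUTH_SECRET', 'OPENAI_API_KEY', 'POSTGRES_URL'];

for (const name of required) {
  if (!process.env[name]) {
    // Failing at startup beats a vague error deep inside a route handler.
    throw new Error(`Missing environment variable: ${name}`);
  }
}
```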

A little more theory as we get into customizations (even with Cursor/Copilot)! Most apps follow an MVC architecture or one of its variations, and it’s worth reading up on, especially if you’re customizing an app further. These three questions should give you a good idea of how the web app works:

Where is data sent in by the user and how is it processed?

When and where in the UI code is it sent to the server?

How is the data updated in the UI and displayed?
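To make those questions concrete for this template: the chat UI uses the AI SDK’s useChat hook, which posts the user’s message to the chat API route and streams the reply back into React state. A simplified sketch of that loop (the real component has far more going on, so treat this as illustrative):

```tsx
// components/chat.tsx (simplified sketch)
'use client';

import { useChat } from 'ai/react';

export function Chat() {
  // (1) user input lives in `input`; (2) handleSubmit POSTs it to /api/chat;
  // (3) the streamed response lands in `messages`, which re-renders the UI.
  const { messages, input, handleInputChange, handleSubmit } = useChat({
    api: '/api/chat',
  });

  return (
    <form onSubmit={handleSubmit}>
      {messages.map((m) => (
        <div key={m.id}>
          {m.role}: {m.content}
        </div>
      ))}
      <input value={input} onChange={handleInputChange} />
    </form>
  );
}
```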

Now it’s deployed! I can click on my deployment from the Deployments tab (or create a new one with the + icon on the top left) and navigate to the website.

Note: when signing up using the default chatbot template, remember to set a password longer than 6 characters, or it returns a not-so-informative ‘failed to validate’ popup message.

Adding Mistral/additional models to the Chatbot

Even though OpenAI models work by default, I wanted to test out other models (Mistral/Azure/Gemini/Perplexity, etc.) in the chatbot, so I tried adding them to the repo. It wasn’t the most straightforward, but maybe that’s because I went in thinking it was aimed at low-code.

First off, Vercel has a neat AI playground to compare models at https://sdk.vercel.ai/playground/, with sample code to integrate each model, and documentation for adding additional models & providers at https://sdk.vercel.ai/providers/ai-sdk-providers/mistral

Now in the chatbot codebase:

The models currently live in /lib/ai/models.ts, and we can simply add additional models after ensuring the API identifier matches the one from the playground or the official provider APIs:
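For example, adding a Mistral entry looked roughly like this (the Model shape below mirrors what the template had when I tried it; double-check yours before copying):

```ts
// lib/ai/models.ts (sketch)
export interface Model {
  id: string;
  label: string;
  apiIdentifier: string;
  description: string;
}

export const models: Array<Model> = [
  {
    id: 'gpt-4o-mini',
    label: 'GPT-4o mini',
    apiIdentifier: 'gpt-4o-mini',
    description: 'Default OpenAI model from the template',
  },
  {
    // New entry: apiIdentifier must match the model name the provider
    // (and the Vercel AI playground) actually expects.
    id: 'mistral-large',
    label: 'Mistral Large',
    apiIdentifier: 'mistral-large-latest',
    description: 'Mistral model added for testing',
  },
];
```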

  1. The current chatbot makes a chat API call with the latest OpenAI experimental functions/tools API, implemented in app/(chat)/api/chat/route.ts
  2. This references a customModel() found in /lib/ai/index.ts, which can be updated with your provider (see the sketch after this list).
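For step 2, here’s roughly what my change to /lib/ai/index.ts looked like, picking the provider package based on the identifier (a sketch; the template also wraps the model in middleware, which I’ve left out here):

```ts
// lib/ai/index.ts (sketch)
import { openai } from '@ai-sdk/openai';
import { mistral } from '@ai-sdk/mistral'; // npm i @ai-sdk/mistral

// Routes the apiIdentifier from models.ts to the right provider.
// The Mistral provider reads MISTRAL_API_KEY from the environment,
// so remember to add it to your environment variables too.
export const customModel = (apiIdentifier: string) => {
  if (apiIdentifier.startsWith('mistral')) {
    return mistral(apiIdentifier);
  }
  return openai(apiIdentifier);
};
```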

A more detailed explanation with code samples on how to add additional models is in this GitHub comment 🔗 in this issue 🔗 (hopefully resolved in a neater way by the time you read this!)

Storage with NeonDB

Next, I needed a database for my chatbot to interface with. Vercel provides a NeonDB integration for storage, which runs on Postgres. It also has some UI + CLI interfaces to get started with creating new tables and data entry (note: a lot of people recommend Supabase here, but Neon was already there, and the free plan offers about 500MB of data).

It’s easy to create a table using the UI, adding the schema and columns with their data types. For data entry, I installed psql in my terminal, connected to my NeonDB instance, imported my CSV file and voilà, I could see it in the UI almost immediately!

Import-from-CSV instructions 🔗: https://neon.tech/docs/import/import-from-csv (tiny catch: I had to preface my table as “public”.“tableName” for this to work)

Connecting the app logic to new database tables

The database schema & queries live in the lib/db part of the codebase. Specifically, add your new schema into schema.ts using the Drizzle ORM format, update the imports, and create any new queries in queries.ts for CRUD (create/read/update/delete) operations on the data.
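As an illustration, adding a hypothetical products table (say, the one I imported via CSV) might look something like this; the table and column names are mine, not the template’s:

```ts
// lib/db/schema.ts (addition, sketch)
import { pgTable, text, uuid, numeric } from 'drizzle-orm/pg-core';

export const products = pgTable('products', {
  id: uuid('id').primaryKey().defaultRandom(),
  name: text('name').notNull(),
  price: numeric('price'),
});

// lib/db/queries.ts (addition, sketch; import `products` from ./schema there)
import { drizzle } from 'drizzle-orm/postgres-js';
import postgres from 'postgres';
import { eq } from 'drizzle-orm';

const client = postgres(process.env.POSTGRES_URL!);
const db = drizzle(client);

export async function getProductByName(name: string) {
  return db.select().from(products).where(eq(products.name, name));
}
```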

Finally, hook up the chat routes to query your database and integrate it with the responses, or use the experimental tools.
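As a sketch of that last step, here’s one way to expose a database lookup to the model as a tool inside the chat route, using the AI SDK’s tool helper (the tool name and the getProductByName query are the hypothetical ones from the schema sketch above):

```ts
// app/(chat)/api/chat/route.ts (excerpt, sketch)
import { streamText, tool, convertToCoreMessages } from 'ai';
import { z } from 'zod';
import { customModel } from '@/lib/ai';
import { getProductByName } from '@/lib/db/queries';

export async function POST(request: Request) {
  const { messages, modelId } = await request.json();

  const result = streamText({
    model: customModel(modelId),
    messages: convertToCoreMessages(messages),
    tools: {
      // The model can call this to pull rows from the new table
      // and fold them into its reply.
      lookupProduct: tool({
        description: 'Look up a product by name',
        parameters: z.object({ name: z.string() }),
        execute: async ({ name }) => getProductByName(name),
      }),
    },
  });

  return result.toDataStreamResponse();
}
```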

And that’s pretty much it! Pretty nice for prototyping!

P.S. Last month I built a fun extension to convert any website into an oddly satisfying browsing experience (a.k.a. brainrot). If you try it out, let me know if you like it! https://chromewebstore.google.com/detail/web-2-brainrot/plimemogopfdmlfejdfbmldjaipihafa 🔗
