# Get Started in 3 Steps
This guide will help you set up the AI Hackathon Guide locally and start using the AI chat assistant to find the best tools for your hackathon project.

## Clone and Install
Clone the repository and install dependencies:
This project uses npm as the package manager. Make sure you have Node.js 18+ installed.
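A typical sequence looks like the sketch below; the repository URL is a placeholder, not the project's actual location:

```shell
# Clone the repository (placeholder URL; substitute the real repo location)
git clone https://github.com/your-org/ai-hackathon-guide.git
cd ai-hackathon-guide

# Install dependencies (requires Node.js 18+ and npm)
npm install

# Start the Vite dev server
npm run dev
```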
## Configure OpenAI API
The AI chat assistant requires an OpenAI API key. Get yours at platform.openai.com/api-keys.
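As a sketch, assuming the repository ships a `.env.example` file (the `OPENAI_API_KEY` variable name is the one referenced later in this guide), the key can be wired up like this:

```shell
# Option 1: .env file (falls back to creating the file if no example exists)
cp .env.example .env 2>/dev/null || touch .env
echo 'OPENAI_API_KEY=sk-your-key-here' >> .env

# Option 2: export the variable for the current shell session instead
export OPENAI_API_KEY=sk-your-key-here
```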
You can supply the key either through a `.env` file or by exporting an environment variable.

Copy the example environment file, then edit `.env` and add your API key.

Never commit your `.env` file or API keys to version control. The `.env` file is already in `.gitignore`.

## Using the AI Chat Assistant

Once the dev server is running, you can access the AI chat assistant from the sidebar.

### General Q&A Mode
Ask general questions about tools, frameworks, or hackathon strategies. Example questions:
- “What’s the difference between Clerk and Auth0?”
- “How do I add realtime features to my app?”
- “Which deployment platform should I use for Next.js?”
This mode uses `gpt-4o-mini` for fast, cost-effective responses.

### Stack Suggestion Mode
Click “Suggest a stack” to get personalized tech stack recommendations. The assistant will:
- Ask clarifying questions about your project (frontend framework, features needed, etc.)
- Analyze the available tools in the guide
- Return a complete stack recommendation with explanations
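The returned recommendation might look roughly like the sketch below; the field names and tool choices are illustrative assumptions, not the project's actual schema:

```json
{
  "frontend": "Next.js",
  "auth": "Clerk",
  "deployment": "Vercel",
  "reasoning": "Clerk and Vercel integrate smoothly with Next.js, which suits a hackathon timeline."
}
```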
This mode uses `gpt-4o` with structured JSON output for reliable recommendations.

## Available Commands
Here are all the commands you’ll need during development:
| Command | Description |
|---|---|
| `npm run dev` | Starts Vite dev server with HMR and API middleware at `/api/chat` |
| `npm run build` | Full production build: TypeScript check → Vite build → esbuild API bundle |
| `npm run build:api` | Builds only the serverless chat function (`server/chat.ts` → `api/chat.js`) |
| `npm run lint` | Runs ESLint on all TypeScript files |
| `npm run preview` | Previews the production build locally |
| `npm run test` | Runs Vitest in watch mode (auto-reruns on file changes) |
| `npm run test:run` | Runs the full test suite once (use before committing) |
| `npm run test:coverage` | Generates coverage report (opens `coverage/index.html` in a browser) |
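For example, a quick pre-commit check can chain the one-shot commands above:

```shell
# Lint, run the test suite once, and confirm the production build passes
npm run lint && npm run test:run && npm run build
```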
## Testing Your Setup
Verify everything is working correctly.

### Check the Homepage
Navigate to http://localhost:5173 and verify:

- Tool sections load (Development Tools, Databases, Auth, etc.)
- Tool cards are visible and expandable
- Theme toggle works (dark/light mode)
### Test the Chat Assistant
Open the chat panel from the sidebar and:
- Send a simple question like “What is Cursor?”
- Verify you get a response (if you see an error, check your API key)
- Try the “Suggest a stack” button
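You can also exercise the API route directly from a terminal. The request body below is an assumption for illustration; the actual schema is defined in `shared/chatPolicy.ts`:

```shell
# Hypothetical payload; check shared/chatPolicy.ts for the real request shape
curl -s http://localhost:5173/api/chat \
  -H 'Content-Type: application/json' \
  -d '{"messages": [{"role": "user", "content": "What is Cursor?"}]}' \
  || echo "Is the dev server running?"
```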
## Deploying to Production
The guide is optimized for Vercel deployment.

### Import to Vercel
- Go to vercel.com/new
- Import your GitHub repository
- Vercel will auto-detect the Vite configuration
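If you prefer the Vercel CLI (assuming it is installed and the project is linked), the environment variable and deploy can also be handled from the terminal:

```shell
# Add the OpenAI key; the CLI prompts for the value and target environments
vercel env add OPENAI_API_KEY

# Trigger a production deployment
vercel --prod
```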
Vercel automatically builds both the static site (`dist/`) and the serverless API function (`api/chat.js`) based on the `vercel.json` configuration.

## Project Structure
Understanding the architecture will help you navigate and contribute.

### Key Files
#### src/content/sections.ts

Single source of truth for all tool data. Each `Section` contains:

- `id`: Unique identifier
- `title`: Display name
- `tools`: Array of `Tool` objects
Each `Tool` includes:

- Basic info: `name`, `tagline`, `description`, `url`
- Features: `bullets` array
- Details: `detailsGuide`, `detailsVideo`, `detailsSections`
#### shared/chatPolicy.ts
#### server/chat.ts
Thin Vercel serverless handler that delegates all logic to `shared/chatPolicy.ts`. It is built to `api/chat.js` using esbuild (via `npm run build:api`).

## Next Steps
Now that you’re set up, explore the guide:

- **Explore Development Tools**: Check out Cursor, Codex, Replit, and other AI-powered editors
- **Browse Databases & Auth**: See all databases, auth services, deployment platforms, and APIs
- **AI Chat Features**: Learn about the AI chat assistant and tech stack recommendations
- **Contribute**: Add new tools, fix bugs, or improve documentation
## Troubleshooting
### Chat returns API errors
Possible causes:
- Missing API key: Verify `OPENAI_API_KEY` is set in `.env` or the environment
- Invalid API key: Check that your key is correct at platform.openai.com
- No credits: Ensure your OpenAI account has available credits
- Rate limiting: Free tier has usage limits; wait or upgrade
### Port already in use
If port 5173 is taken:
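One option is to pass a different port through npm to Vite (the `--port` flag is standard Vite CLI behavior):

```shell
# Forward --port through npm to the Vite dev server
npm run dev -- --port 5174
```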
### Tests failing
Common issues:
- Outdated dependencies: Run `npm install`
- TypeScript errors: Run `npm run build` to see type errors
- Coverage threshold: The project requires 80% coverage (configured in Vitest)
### Deployment issues on Vercel
Build fails:
- Verify `vercel.json` is present
- Check build logs for TypeScript or dependency errors

Chat API errors after deploy:

- Ensure `OPENAI_API_KEY` is set in Vercel environment variables
- Redeploy after adding env vars
- Check Function logs in Vercel dashboard
For additional help, check the GitHub Issues or ask in the AI chat assistant using the deployed site at deepwiki.com.