Vibe Coded: AI Who’s Who

Over the weekend I successfully completed my eighth vibe-coded project, and I’m really excited about it! It’s called the “AI Cultural Canon,” and it’s basically a Firebase-powered front-end website for 27 people who arguably make up a “Who’s Who” list in the world of AI / artificial intelligence. Visitors can read a short bio, see a photo, watch a related YouTube video, and link to the person’s English Wikipedia article from the simple web interface. I named it “AI Cultural Canon” as a nod to the term and concept of “cultural literacy,” which I learned about years ago reading E.D. Hirsch’s 1987 book of the same name.

Check this project out with these links:

This project builds on my earlier Google Sites Firebase Comment Widget experiment, detailed in my November 4th blog post, and feels like a significant next step in my journey back into database-powered web development, now with AI as an imperfect but extremely helpful coding partner.

In this post I want to do three things:

  1. Push back on some misconceptions I keep hearing about “vibe coding.”
  2. Share what this new project does and what it suggests may be possible with students.
  3. Connect the dots back to my early 2000s FileMaker Pro days and why this all matters for my teaching and for media literacy.

My Vibe Coding Journey to Firebase (CC BY 4.0) by Wesley Fryer

Vibe Coding Is Not a Magic Button

One misconception I see a lot is that “coding with AI” is basically pushing a button and getting a finished app. That has not been my experience.

For this project:

  • I spent multiple hours going back and forth with both ChatGPT 5.1 Thinking and the latest Google Gemini model. (All of my ChatGPT conversations, but not those with Gemini, are detailed in my AI documentation Google Doc.)
  • Those conversations ran to hundreds of lines of prompts, code revisions, bug reports, and “wait, that broke something else…” moments.
  • I had to debug Firebase rules, tweak HTML/CSS layout, refactor how config is handled for security, and iterate the admin UI multiple times.

Yes, the AI wrote basically ALL of this actual code. However:

  • I had to steer the architecture (What functionality did I want? How should the UI look? How should an admin page look and function? etc.)
  • I had to notice when something was broken or “good enough but not actually what I asked for.”
  • I had to keep track of security, data structure, and deployment details that were very easy to lose in the middle of a long chat.

So vibe coding, at least for me, is not “type one sentence, get a working app.” It’s much closer to:

A very intense, collaborative code jam with a robot partner who never gets tired, sometimes forgets what you just decided, and will happily write a hundred lines of code in the wrong direction if you’re not paying attention.


But Vibe Coding Is Absolutely Transformative

All that said: there is no way I could have built this project on my own without AI assistance.

The AI Cultural Canon app:

  • Uses Firebase Firestore as a cloud database.
  • Has a public explorer page where students can filter by category, click into individual people, and see:
    • A short bio (in Markdown),
    • Optional image,
    • Optional YouTube video,
    • DOB / DOD and a link to their Wikipedia article.
  • Includes a password-protected admin dashboard where a teacher can:
    • Add / edit categories,
    • Add / edit people entries,
    • Manage tags, links, and bios.
  • Includes a seeding script so I (or another teacher) can quickly pre-load the database with example entries.
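
That seeding script is really just a loop of Firestore writes. Here’s a rough sketch of the idea, with placeholder entries and field names rather than my actual seed data:

```javascript
// Rough sketch of a seeding script (placeholder data and field names, not my real seed file).
// Assumes the Firebase app has already been initialized with its web config.
import { getFirestore, doc, setDoc } from "https://www.gstatic.com/firebasejs/10.12.0/firebase-firestore.js";

const db = getFirestore();

const examplePeople = [
  { id: "example-person-1", name: "Example Person One", categoryId: "example-category", bioMarkdown: "Short **Markdown** bio goes here." },
  { id: "example-person-2", name: "Example Person Two", categoryId: "example-category", bioMarkdown: "Another short bio." },
];

async function seedPeople() {
  for (const person of examplePeople) {
    // setDoc with a chosen document ID makes it safe to re-run the seed without creating duplicates.
    await setDoc(doc(db, "people", person.id), person);
  }
}

seedPeople().then(() => console.log("Seeded example people."));
```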

Under the hood that’s a lot of moving parts: Firestore collections and rules, authentication, modular JavaScript imports, markdown parsing, and a simple UI that still works when embedded inside a Google Site.
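
If you want a feel for what those pieces look like in code, here’s a minimal sketch of the explorer’s read path and the admin sign-in using the modular Firebase SDK. The collection names (categories, people) match what I described above, but the config values, field names, and element ID are just illustrative placeholders, not my exact code:

```javascript
// Minimal sketch of the explorer read path and admin sign-in (illustrative names, not my exact code).
import { initializeApp } from "https://www.gstatic.com/firebasejs/10.12.0/firebase-app.js";
import {
  getFirestore, collection, query, where, getDocs
} from "https://www.gstatic.com/firebasejs/10.12.0/firebase-firestore.js";
import {
  getAuth, signInWithEmailAndPassword, onAuthStateChanged
} from "https://www.gstatic.com/firebasejs/10.12.0/firebase-auth.js";

// Web app config copied from the Firebase console. It is not a secret:
// what it can actually do is limited by the Firestore security rules.
const app = initializeApp({ /* apiKey, authDomain, projectId, ... */ });
const db = getFirestore(app);
const auth = getAuth(app);

// Explorer page: load everyone in a chosen category so we can render their cards.
async function loadPeopleByCategory(categoryId) {
  const q = query(collection(db, "people"), where("categoryId", "==", categoryId));
  const snapshot = await getDocs(q);
  return snapshot.docs.map((d) => ({ id: d.id, ...d.data() }));
}

// Admin dashboard: sign the teacher in (I'm assuming email/password auth here),
// which is what lets their writes through the security rules.
async function adminSignIn(email, password) {
  await signInWithEmailAndPassword(auth, email, password);
}

// Show or hide the admin UI based on the current auth state ("admin-panel" is a placeholder ID).
onAuthStateChanged(auth, (user) => {
  document.getElementById("admin-panel").hidden = !user;
});
```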

I’ve long believed that database-enabled web apps are one of the most powerful things we can create on the web. In the late 1990s and early 2000s I was building web-enabled databases with FileMaker Pro, and those solutions were incredibly useful in a college administrative setting. But modern stacks (frameworks, security concerns, cloud hosting) have a much steeper learning curve than “click a FileMaker web-publish button.”

Vibe coding changes that equation. It doesn’t make the work simple, but it makes it possible for me, as a full-time teacher, to build and ship real, data-driven web apps again.


Building on the Google Sites Firebase Comment Widget

The AI Cultural Canon builds directly on my previous project:

Google Sites Firebase Comment Widget
https://github.com/wfryer/google-sites-firebase-comment-widget

That comment widget was my first project using Firebase as a backend. It taught me:

  • How to set up a Firebase project and web app
  • How to use Firestore for simple create/read/update operations
  • How to embed a custom widget in a Google Site with <iframe>s
  • How to keep Firebase config usable but reasonably safe in a public GitHub repo
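
For the widget, the “create” side really came down to a single addDoc call. Here’s a rough sketch of that pattern, with a placeholder comments collection and field names:

```javascript
// Sketch of the widget's "create" path (placeholder collection and field names).
// Assumes initializeApp() has already run with the project's public web config;
// the config itself isn't a secret, because the Firestore rules decide what it can do.
import {
  getFirestore, collection, addDoc, serverTimestamp
} from "https://www.gstatic.com/firebasejs/10.12.0/firebase-firestore.js";

const db = getFirestore();

// Save a new comment tied to a particular embedded page.
async function postComment(pageId, name, text) {
  await addDoc(collection(db, "comments"), {
    pageId,
    name,
    text,
    createdAt: serverTimestamp(),
  });
}
```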

The AI Cultural Canon takes those same ideas and expands them:

  • Instead of a single comments collection, I now have two structured collections: categories and people.
  • Instead of an invisible backend, there’s a teacher-friendly admin UI for managing records.
  • Instead of “just get it running,” I now have documented Firestore security rules that keep the public site read-only and limit writes to authenticated users.
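
The general shape of those rules looks something like this (a simplified sketch of the read-only-public / authenticated-writes pattern, not a copy of the rules in the repo):

```
rules_version = '2';
service cloud.firestore {
  match /databases/{database}/documents {
    // Anyone can read the public explorer data.
    // Only signed-in users (the admin) can write.
    match /categories/{categoryId} {
      allow read: if true;
      allow write: if request.auth != null;
    }
    match /people/{personId} {
      allow read: if true;
      allow write: if request.auth != null;
    }
  }
}
```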

From a learning standpoint, this project feels like “Level 2” of my Firebase journey. And it’s giving me more confidence that I can keep going: more collections, richer relationships, perhaps even multi-user workflows down the road.


Why This Matters for Teaching and Media Literacy

By day I’m a middle school web design teacher and I’ve been building websites since 1996. I care a lot about:

  • Helping students understand how the web actually works, not just how to click inside walled gardens.
  • Showing them that they can build things that store and present information, not just static brochure pages.
  • Connecting web design to media literacy: Who gets represented? How is information presented? Who curates the information? Who is the intended audience? What seems to be the purpose of the presented media?

A project like AI Cultural Canon hits all of those notes:

  • It’s a curated, opinionated dataset: Which AI “voices” are we lifting up? What bios do we write? What sources do we link?
  • It’s a concrete example of a database-backed website that students can see, click through, and potentially help expand.
  • It’s a template I can re-use: students could build their own “canons” of scientists, artists, historical figures, or local community leaders.

I’m already thinking about how to turn this into a lesson or mini-unit:

  • Introduce students to the public explorer and talk about representation and selection.
  • Show them the admin view and the underlying data model.
  • Have them sketch or co-design their own canon projects.
  • Potentially, in an upper-level class, invite them into the vibe-coding process itself.

Looking Ahead

If you’d like to explore the code, everything is on GitHub:

This is my eighth successful vibe-coded project, and it definitely won’t be the last. The combination of:

  • long-form, back-and-forth conversations with tools like ChatGPT and Gemini,
  • my own mental model of web development and pedagogy, and
  • the power of platforms like Firebase

is opening doors I honestly thought were closed to me as a classroom teacher with limited time.

If you end up forking the repo, adapting it for your own canon, or using it with students, I’d love to hear how it goes.

