A Polyglot Site for a Polyglot Conference
It's time to get this project moving! Here's what we're going to do and how you can help!
As part of the process of re-establishing the Carolina Code Conference, there was one major item that didn't get checked off my list last year…a new website.
This little Substack blog has done a pretty incredible job to this point, and with the short timeline to get everything ready last year, I decided the potential benefit of a simple static conference site wasn't worth taking away from the limited time we had.
Given the effectiveness and versatility of the Substack offering, including email newsletters, podcasting, and now even video hosting, a plain static site is even less compelling. We can easily keep distributing information right here.
So, if we are going to do a website for the Carolina Code Conference it better be worth it.
So…what does that look like?
What do we need?
There are, of course, the core requirements…
Showcase the basic information about the conference: where, when and how do I get tickets?
Highlight our sponsors in the most effective way possible to maximize their investment
Highlight our speakers and videos of their talks over the years
Provide access to news and information
These core requirements are almost entirely addressed by this Substack blog on its own, but we can definitely find better ways to showcase our sponsors.
What do we want?
But what are the compelling requirements?
It should be fun!
It should be interesting enough that you’d want to share the site with a friend. The “you gotta see this!” share factor.
It should be a reflection of the conference’s purpose.
It should provide a way for sponsors and the community to get involved.
How exactly are we going to do this?
We’re going to build a ridiculously over-complicated website crammed with as much technology as we possibly can for no real purpose other than because it’s fun, interesting and because we can.
We’re going to build a site out of multiple programming languages, and then we’re going to keep adding new languages as the years go on, via contributions from community members, meetup groups, or people who just want to learn a language while building something they can link to on their resume.
We’re going to let sponsors showcase their technology, where possible, by including it in our absurdly over-engineered site and then showcasing it both at the conference during the keynote and on the website itself.
And we’re going to show benchmarks because we’re programmers (mostly) and programmers love benchmarks. You know it’s true. Benchmarks are like geek drugs. We look at them. We inspect them. We scrutinize them. And…we share them.
“Is this your idea of fun, Mav?” - Goose
So how do we build this crazy thing?
This is where we’ll get into the technical bits. There are three core parts to the initial version of this site.
A Frontend Server that will take requests from browser clients, call an API to retrieve the needed data and then return HTML responses back to the browser.
Read-only APIs in multiple languages that will be called by the Frontend to retrieve the data for the request. We should be able to add any language we want.
A Database that the read-only APIs will connect to in order to query the data for the response.
Here’s an example workflow, assuming we have read-only APIs available in two languages, say Go and Rust.
A user visits the site to see our 2024 speakers…
GET /speakers?year=2024 OR /speakers/2024
The Frontend Server will…
Receive the request, parse and validate the URL
See that we need the list of speakers for 2024
Check which API endpoints are available to handle the request
Find endpoints available for Rust and Go, randomly selecting the Rust endpoint
Record the Request timestamp
Call the Rust API endpoint
The Read-only API server will…
Parse the request
Connect to the database and query speakers for the year 2024
Return the response
The Frontend Server will…
Record the Response timestamp
Log the total request time for the Rust API to some type of metric collector
Render the HTML template for /speakers/2024 back to the browser with a little note that it was served by Rust in 23ms.
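The frontend’s routing-and-timing steps above can be sketched in code. This is only an illustration (the real frontend will probably be Elixir, and the backend URLs here are placeholder assumptions), but it shows the core idea: pick a random available language backend, call it, and measure the round trip.

```go
package main

import (
	"fmt"
	"io"
	"math/rand"
	"net/http"
	"time"
)

// backends maps a language name to the base URL of its read-only API.
// These URLs are placeholders; a real registry would be configurable.
var backends = map[string]string{
	"go":   "http://localhost:8081",
	"rust": "http://localhost:8082",
}

// pickBackend randomly selects one available language backend,
// mirroring the "randomly selecting the Rust endpoint" step above.
func pickBackend() (lang, base string) {
	keys := make([]string, 0, len(backends))
	for k := range backends {
		keys = append(keys, k)
	}
	lang = keys[rand.Intn(len(keys))]
	return lang, backends[lang]
}

// fetchSpeakers calls the chosen API and records how long it took,
// which is the number we'd report ("served by Rust in 23ms").
func fetchSpeakers(year int) (body []byte, lang string, elapsed time.Duration, err error) {
	lang, base := pickBackend()
	start := time.Now() // record the request timestamp
	resp, err := http.Get(fmt.Sprintf("%s/speakers/%d", base, year))
	if err != nil {
		return nil, lang, 0, err
	}
	defer resp.Body.Close()
	body, err = io.ReadAll(resp.Body)
	elapsed = time.Since(start) // response timestamp minus request timestamp
	return body, lang, elapsed, err
}

func main() {
	lang, base := pickBackend()
	fmt.Printf("routing to the %s backend at %s\n", lang, base)
}
```

The timing captured here is exactly what we’d ship off to the metric collector and render in the footer of the page.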
This is the simple overview of how things will work. We need to ensure that the API follows a consistent structure so that the Frontend server can switch from language to language. On the database side, we’ll need to define the schema and query expectations for each API interface to share.
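To make that language-swapping possible, every backend has to return the same response shape. Here’s a rough sketch of what that shared contract might look like on the frontend side; the field names are assumptions, not a finalized schema.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Speaker is a guess at the shared response shape every language's API
// would return. As long as the Go, Rust, or any future backend emits
// this same JSON structure, the frontend doesn't care who served it.
type Speaker struct {
	Name string `json:"name"`
	Talk string `json:"talk"`
	Year int    `json:"year"`
}

// decodeSpeakers parses the JSON body returned by any backend into the
// shared structure, regardless of which language produced it.
func decodeSpeakers(body []byte) ([]Speaker, error) {
	var speakers []Speaker
	err := json.Unmarshal(body, &speakers)
	return speakers, err
}

func main() {
	// Example payload as it might come back from the Go or Rust API.
	raw := []byte(`[{"name":"Ada Lovelace","talk":"Analytical Engines","year":2024}]`)
	speakers, err := decodeSpeakers(raw)
	if err != nil {
		panic(err)
	}
	fmt.Println(speakers[0].Name, speakers[0].Year)
}
```

Pinning the contract down in one place like this is also what lets new contributors add a language: implement the endpoints, match the JSON, and you’re in the rotation.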
Deploying the Beast
This gets even more fun. The simplest possible way to do this would be to just spin up a virtual server with everything on it, then call it a day. But where’s the fun in that?
Part of the fun of this is the learning experience for everybody. Knowing that, even the deployment should be a learning opportunity.
Why not put each API in a Docker container and deploy it out to a host? We could deploy anywhere, whether to a Kubernetes cluster or some other new tooling that captures our attention. We could even deploy to a hosting company’s infrastructure if they wanted to get involved as a sponsor.
As for the database, I could just spin up a PostgreSQL instance but there are a lot of database providers out there. Maybe one of them is interested in being highlighted as a sponsor. Maybe we use a different database entirely?
The frontend server should probably be written in Elixir/Erlang, simply because the BEAM is ideal for that type of load-balancing workload, holding many parallel connections open while waiting for responses.
And of course, to manage and test all of these different language containers and deployments we’re going to need to automate the entire thing with version control and CI/CD which should also create a great opportunity for an interested sponsor.
What does an MVP look like?
The absolute minimum viable product for us to launch with will require the frontend server, the initial HTML/CSS for the site, a database of speakers and sponsors, and read-only APIs for that database in at least two languages…plus somewhere to deploy it all.
Ideally, we also need a good place to track performance stats which could be in the database but it could also be a good opportunity for a sponsor to showcase some time-series data tracking functionality.
We can grow the system year over year with new languages, new tools, and more polish.
So how do we make this happen?
I need volunteers who want to get involved! If you’d like to take part in this project somehow, send an email to firstname.lastname@example.org with a subject of “Polyglot Site” to express your interest.
If there’s something you’re learning that you’d like to use on this project, let me know. You will not hear “keep it simple” from me. You’re only going to hear “Cool! Let’s see how we can make it work.”
Trying to learn K8s, Docker or Podman? Let me know.
Learning a new programming language and want to create an API interface with it? Let me know.
Do you run a meetup group that wants to show off its technology here? Let me know.
Do you run a company that’s interested in sponsoring some piece of the system as a database, infrastructure, security, CI/CD or other type of provider? Let me know. You may also want to look at the Call for Sponsors.
We’ll make sure that everybody who’s involved, whether an individual, meetup, or company, is well documented on the website for as long as it’s up. If you’re a conference sponsor contributing your technology to make this work, we’ll also highlight your product in depth during the opening talk of the conference.
For those interested, reach out as described above. We’ll schedule an initial call to plan, discuss architecture and requirements and then go from there!
I look forward to hearing from you!
Subscribe now to be the first to know about new developments around the conference!