Abyss
For this chapter we’ll clean up the multiplayer experience.
After publishing the last one, I ended up spending a solid few hours debugging how to make it work.
Finally I discovered it was an issue within the library:
// y-codemirror.next/sync.js
class YSyncPluginValue {
  constructor(view) {
    this.view = view
    this.conf = view.state.facet(ySyncFacet)
    this._observer = (event, tr) => {
      if (tr.origin !== this.conf) {
        const delta = event.delta
        const changes = []
        let pos = 0
        for (let i = 0; i < delta.length; i++) {
          const d = delta[i]
          if (d.insert != null) {
            changes.push({ from: pos, to: pos, insert: d.insert })
          } else if (d.delete != null) {
            changes.push({ from: pos, to: pos + d.delete, insert: "" })
            pos += d.delete
          } else {
            pos += d.retain
          }
        }
        view.dispatch({ changes, annotations: [ySyncAnnotation.of(this.conf)] })
      }
    }
    this._ytext = this.conf.ytext
    this._ytext.observe(this._observer)
  }
  // ...
}
YJS is in the middle of a breaking upgrade from 13 to 14. The Phoenix channel adapter we’re using is already on the unreleased 14 branch.
The issue was that the delta, in the 14 version, has a new shape. We need to access its ops property to actually get the changes happening on either tab.
So it’s a one-line change:
- const delta = event.delta
+ const delta = event.delta.ops
I looked into publishing a patched fork and pointing the dependencies at it, but I found it easier to just vendor the whole library in and apply this change.
So, now the multiplayer experience is actually working!

Cool. I think this is only the beginning though, in terms of integration.
This was achieved using a “blank” buffer. It was not attached to any file.
Hours researching
One thing I was curious about changing was how selection highlighting works to mimic VSCode. But it’s something for later.
Anyway, we need to begin wiring the multiplayer experience into the actual files we’re hosting. I think it just means replacing all of the FileBuffers with the YJS buffer instead.
Hours researching, two days break
I got sidetracked on whether it’s smart to adopt this YJS dependency. One issue with using NIFs (natively implemented functions) in Elixir is the potential of crashing the BEAM virtual machine or blocking the process scheduler.
Technically there is a way to implement it ourselves… but this is another example of premature optimization. If we end up running into too many issues with YJS then we can replace it with our own. However painful that may be.
Anyway, let’s wire up the buffers to actual files.
Day break
So now that we’re working with Channels, we can scrap all of the prior work we did on managing files. Oh well.
We might still be able to hold onto the endpoints we made; perhaps not. We’ll see.
Anyway, look at how much simpler the FileBuffer type looks:
export type FileBuffer = {
  path: Path
+ ydoc: Y.Doc
- savedContent: string | null
- unsavedContent: string | null
  isDirty: boolean
  status: "unloaded" | "fetching" | "loaded" | "failed" | "creating"
}
I don’t think we’ll need isDirty or status, necessarily; it depends.
Hours later
So I realized the best way to go about this is to just attach an editor state to each file buffer.
It looks like CodeMirror’s EditorState is immutable, so I guess instead of attaching one we can recreate it each time.
Hours more
The whole reason I’m gaslighting myself into using React is on the off chance I ever want to make a mobile app.
Otherwise its lifecycle is so obtuse and annoying to work around that I don’t see the point in dealing with it. Hopefully after this is completed it’ll be fine.
Hour or so
So I ditched the codemirror-react library because I couldn’t access the actual codemirror state easily.
This makes a lot more sense:
export default function EditorPane() {
  const [container, setContainer] = useState<HTMLDivElement | null>(null)
  const [view, setView] = useState<EditorView>()
  const handleRef = useCallback((node: HTMLDivElement | null) => {
    if (node !== null) setContainer(node)
  }, [])
  const activeBuffer = useEditorStore((state) =>
    state.activePath ? state.buffers[state.activePath] : null
  )

  useEffect(() => {
    if (!container || !activeBuffer) return
    if (!view) {
      const initView = new EditorView({
        state: createEditorState(activeBuffer),
        parent: container,
      })
      setView(initView)
    } else {
      view.setState(createEditorState(activeBuffer))
    }
    return () => {
      view?.destroy()
      setView(undefined)
    }
  }, [container, activeBuffer?.path])

  return (
    <div className="col-span-10 h-screen cm-theme-dark" ref={handleRef}>
      {!activeBuffer && <div className="p-4 text-gray-400">Select a file to get started.</div>}
    </div>
  )
}
So the reason I had to do it this way was because when we attach the collaboration plugins they’re more tangled in than other plugins. The react-codemirror way to do things was to just change the extensions array, but the collaboration plugin wasn’t completely disconnecting, causing overwrites on differing files.
Following the codemirror documentation, we want to use view.setState in these cases:
EditorView.setState(newState: EditorState): void
Reset the view to the given state. (This will cause the entire document to be redrawn and all view plugins to be reinitialized, so you should probably only use it when the new state isn’t derived from the old state. Otherwise, use dispatch instead.)
I’ve been seriously debating replacing YJS because of its complexity… but maybe another time.
The final thing that needs to happen is to actually load the file and then save it through the sockets instead of the api.
Before doing that, we need to improve our auth flow in the websocket. That required a few modifications.
# lib/project_web/channels/user_socket.ex
def connect(_params, socket, connect_info) do
  with %{"user_token" => ut} <- connect_info.session,
       {user, _token_inserted_at} <- Accounts.get_user_by_session_token(ut) do
    {:ok, assign(socket, current_scope: Scope.for_user(user))}
  else
    _ -> {:error, :unauthorized}
  end
end
We now store the user’s current scope, passed into assign. With that, we can handle authentication in our channel.
# lib/project_web/channels/doc_channel.ex
@impl true
def join("site_docs:" <> rest, _payload, socket) do
  with [subdomain, doc_name] <- extract_subdomain(rest),
       {:ok, socket} <- authorized?(socket, subdomain, doc_name),
       {:ok, doc_pid} <- start_shared_doc(subdomain, doc_name) do
    Process.monitor(doc_pid)
    SharedDoc.observe(doc_pid)
    {:ok, assign(socket, doc_name: doc_name, doc_pid: doc_pid)}
  end
end
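I’m not showing authorized?/3 or extract_subdomain/1; roughly, the auth check just verifies that the connecting user owns the site behind that subdomain. A sketch (the Hosting.get_site_by_subdomain/1 lookup is a stand-in, not the real helper):
# lib/project_web/channels/doc_channel.ex (sketch)
defp authorized?(socket, subdomain, _doc_name) do
  %{user: %{id: user_id}} = socket.assigns.current_scope

  # Hosting.get_site_by_subdomain/1 is a hypothetical lookup helper.
  case Hosting.get_site_by_subdomain(subdomain) do
    %{user_id: ^user_id} -> {:ok, socket}
    _ -> {:error, :unauthorized}
  end
end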
Now we’re entering esoteric Elixir land, but if you just think of each module vaguely as its own process, things start to make a bit more sense.
Anyway, I think we need to change the site file structure eventually. It’ll be okay.
Few days break
I keep getting sucked into architecture decisions. I think I FINALLY have it all figured out, which is great. Did some more research on caching strategies, but again, in a way, this is all premature optimization.
I was also researching keybindings in codemirror. They have an emacs package; I wonder if anyone would want vim supported too.
Another example of a package being wrong, though… it had /*@__PURE__*/ annotations that let the bundler strip out the commands, so they never actually ran! Had to debug it for a while before removing those annotations.
We’ll probably have to create an org on github and fork all of these things… once I buy the domain, of course.
Anyway, let’s finally actually save content.
To do so we can create a new handle_in in the DocChannel.
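Roughly, that handler might look something like this (the event name, payload shape, and the Hosting.save_site_file/3 helper are placeholders for now):
# lib/project_web/channels/doc_channel.ex (sketch)
@impl true
def handle_in("save", %{"content" => content}, socket) do
  scope = socket.assigns.current_scope

  # Hosting.save_site_file/3 is a placeholder for whatever ends up persisting the buffer.
  case Hosting.save_site_file(scope, socket.assigns.doc_name, content) do
    {:ok, _file} -> {:reply, :ok, socket}
    {:error, reason} -> {:reply, {:error, reason}, socket}
  end
end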
Day later
So in order for this to work we’ll need to switch to file ids rather than their names. Because soon we need to be able to support a file tree and renaming files and stuff: the only thing that can remain static is the ID.
Indeterminate time pass
Honestly, there’s so much I have to keep changing around as I figure out what needs to be done that it’s hard to state where to begin. Usually programming logs or pseudo-books have this smug underlying gist that the author knows what they’re doing and is hand-holding you along. Such smugness was abandoned a long time ago…
In any case, where did we leave off? There’s a new channel to handle more “global” stuff during an editing session. The file system is now more “tree-like” and we can resize things.
The “store” on the frontend is putrid right now though. Still not sure what the exact lifecycle is…
After that we need to get preview working, I think. We still need to reindex all of the buffers by id. Once that’s all working we need to explore theming more.
I’ve been reading up more about collaboration lately. Not in the coding sense, but the social sense.
It’s strange to build a platform where you aren’t even sure you’ll collaborate or use that feature.
But when I see cathedrals I do wonder and it’s a partial shame, the current state of the game. If at least two users end up collaborating on this platform, then I’d say that’s worth it.
Well, anyway, I’m going to go switch out all uses of path for files and instead make them ids.
I think I’m finally going to merge Storage into all of the Hosting functions so it’s just one call, too.
Hours later
Okay, almost done! What did I get sidetracked on this time…
I wasn’t liking the memory footprint. Having a channel process open for every document seems a little much. There is a way to restrict it all to one channel, but we’ll explore that later. We’d still have to correlate these updates:
# lib/project_web/channels/doc_channel.ex
@impl true
def handle_in("yjs_sync", {:binary, chunk}, socket) do
  SharedDoc.start_sync(socket.assigns.doc_pid, chunk)
  {:noreply, socket}
end

@impl true
def handle_in("yjs", {:binary, chunk}, socket) do
  SharedDoc.send_yjs_message(socket.assigns.doc_pid, chunk)
  {:noreply, socket}
end
to the assigns.doc_pid. We’d have to fit the ID correlation in somewhere, probably within the event itself, like yjs:#{file_id}. Then it’d be a simple map of file_id -> doc_pid. But that’d require editing the channel library. There are a lot of interesting optimizations to do when the memory on the server gets too bloated.
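For reference, a sketch of what the multiplexed handler could look like on the Elixir side (the doc_pids map in the assigns is an assumption, and the client would need matching changes):
# Sketch only: one channel handling every open document by encoding the
# file id into the event name.
@impl true
def handle_in("yjs:" <> file_id, {:binary, chunk}, socket) do
  doc_pid = Map.fetch!(socket.assigns.doc_pids, file_id)
  SharedDoc.send_yjs_message(doc_pid, chunk)
  {:noreply, socket}
end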
Anyway, we’ve now moved over to channels to create and delete files. I don’t think I’ve shown much of the “store” file which holds all the functions, but I’ll walk through how create and delete work.
So when one opens the input on creating a file:
function FilenameEdit({ onCreate, onCancel }: FilenameEditProps) {
  const [filename, setFilename] = useState("")

  const handleCreate = () => {
    const path = filename.trim()
    if (path) {
      onCreate(path)
      setFilename("")
    }
  }

  return (
    <div className="ml-3 flex items-center gap-x-1 text-xs text-slate-400">
      <FileIcon className="size-3.5 min-w-3.5" />
      <input
        value={filename}
        onChange={(e) => setFilename(e.target.value)}
        onKeyDown={(e) => {
          if (e.key === "Enter") handleCreate()
          if (e.key === "Escape") onCancel()
        }}
        autoFocus
        className="bg-transparent border-none outline-none text-slate-200"
        placeholder="new-file.html"
      />
    </div>
  )
}
On Enter we call handleCreate, which calls our store function createFile (passed in as onCreate):
// assets/js/stores/editorStore.ts
createFile: async (path) => {
  const exists = Object.values(get().buffers).some((buf) => buf.path === path)
  if (exists) {
    throw new Error(`File already exists at path: ${path}`)
  }

  let metaChannel = get().getMetaChannel()
  metaChannel.push("file:create", { path }).receive("ok", async (file: SiteFile) => {
    set((state) => ({
      buffers: {
        ...state.buffers,
        [file.id]: toFileBuffer(file),
      },
    }))
    await get().preloadFile(file.id)
    set((state) => ({
      openOrder: [...state.openOrder, file.id],
      activeFileId: file.id,
    }))
  })
},
Since we changed the index of state.buffers to go by file id instead, we have to iterate all of the buffer paths to check for conflicts. Ideally we can later show a little warning like “this already exists” before hitting enter.
Past that, we now call into the “meta” channel, i.e. the “site meta”: site settings, file management, permissions (in the future), etc. The document channel is purely for editing.
The getMetaChannel is a “lazy singleton”, meaning it’s only booted up the first time it’s needed. We then attach channel event handlers. I’m only showing the file:create event and omitting the others.
getMetaChannel: () => {
  let metaChannel = get().metaChannel
  if (metaChannel) return metaChannel

  metaChannel = get().getSocket().channel(`site_meta:${get().subdomain}`)
  metaChannel
    .join()
    .receive("ok", () => console.log("Joined meta channel"))
    .receive("error", (err) => console.error("Join failed", err))

  metaChannel.on("file:create", (file: SiteFile) => {
    set((state) => ({
      buffers: {
        ...state.buffers,
        [file.id]: toFileBuffer(file),
      },
    }))
  })
  // ...

  set({ metaChannel })
  return metaChannel
},
Going back to our create function, we push an event to the server:
metaChannel.push("file:create", { path })
The server receives this, tries to create a file, and then lets everyone else working on the site know that a file was created:
# lib/project_web/channels/meta_channel.ex
@impl true
def handle_in("file:create", %{"path" => path}, socket) do
  scope = socket.assigns.current_scope

  case Hosting.create_site_file(scope, path) do
    {:ok, file} ->
      broadcast_from!(socket, "file:create", file)
      {:reply, {:ok, file}, socket}

    {:error, reason} ->
      {:reply, {:error, reason}, socket}
  end
end
The broadcast_from avoids firing the event back to the socket which called it, a.k.a. the person who actually created the file. For them we want to open that file on the screen, so we have custom handling: when a file is created, their buffer is set to the file they just created.
One thing to note is that the broadcast_from can use a different event name if so desired; e.g. instead of file:create it could be file:created, or anything else, if you want to differentiate it from the “push” event of file:create. I find it easier to keep the same names to make sense of things.
A lot of this code can be greatly organized, but let’s just keep pushing ahead.
Let’s see…
The idea here is to make an editor palatable enough for me to use it; otherwise how could I expect others to use it. So I think the few things left would be easy jump-to-file with a command palette, then drag and drop in the file explorer.
I personally never use multiple panes; it’s too distracting. The only advantage I could see is if you want to edit CSS and HTML side by side. I don’t know. I’ll give it some thought if there are enough requests to support multiple editor panels.
I also never use tabs; it’s easier to jump between everything. But I do think it’s nice to see which files you have open and to click between them, depending. Maybe later.
There are a lot of “nice” things to implement, but they aren’t required, not even the command palette (just yet). So I think what’s next is either preview, or just start building the site.
I’ll add the preview later when switching tabs becomes too annoying.
Okay, back to making siqu.neocities.org.
Beginning again: siqu.neocities.org
Now that we have collaboration working, let’s continue.
I think one other thing we can fix right now is changing language modes depending on the file. Right now everything defaults to javascript mode.
// assets/js/components/EditorPane.tsx
if (buffer.path.endsWith("js")) extensions.push(javascript({ jsx: true }))
if (buffer.path.endsWith("html"))
  extensions.push(html({ matchClosingTags: true, autoCloseTags: true }))
Cool, now the HTML autocomplete is working. It looks like it automatically sets up the javascript and css contexts too, for style and script tags.
May as well add in css support too:
if (buffer.path.endsWith("css")) extensions.push(css())
I think it’s okay to rely on the extension to determine the file type, but we could also use the mime_type. I can’t really think of a reason to do so at the moment. The main reason we have mime_type on the SiteFile model is for indexing, potentially, and also file counts, things like that.
Okay, so to recreate siqu.neocities.org we’re going to need to do a few things.
Few hours later
I was researching alternatives on how to handle CSS generation, but let’s just use the CDN and not do that.
Okay, anyway, here’s what the index page of the static site generator for this website looks like.
<Layout title={SITE_TITLE} description={SITE_DESCRIPTION}>
  <main class="grid grid-cols-12 max-w-[1700px] mx-auto grid-flow-row-dense gap-5 p-5">
    <div class="hidden sm:block col-span-12 sm:col-span-6">
      <TitleImage wordCount={wordCount(posts)} />
      <MainImage />
    </div>
    <div class="col-span-12 sm:col-span-6">
      <div class="h-[970px] rounded-lg overflow-scroll">
        <details
          open
          class="flex flex-col items-start bg-[#1a242f] bg-background-alt p-[10px] pt-[10px] pb-0 rounded-lg overflow-hidden"
        >
          <summary class="no-underline flex hover:no-underline focus:no-underline">
            她的仁慈也變成我的了。 <span class="inline-block">⏳→ 老</span>
            <div class="ml-auto inline-flex gap-x-2">
              <span class="text-slate-500" id="sort-latest">latest</span>
              <span id="sort-popular">popular</span>
              <span id="sort-last-viewed">last viewed</span>
            </div>
          </summary>
          <ul id="posts-container" class="flex px-0 list list-none justify-between flex-wrap gap-2">
            {
              posts.map((post, i) => (
                <li data-default-order={1_000_000 - i} data-hits={0} data-last-hit={0}>
                  <a
                    class="post font-mono w-full text-slate-200 no-underline"
                    href={`/posts/${post.id}/`}
                  >
                    <h3 class="text-xl flex items-center">
                      {post.data.title}
                      {post.data.legacy && "⏳"}
                      <span class="ml-auto text-lg font-thin">
                        <FormattedDate date={post.data.date} />
                      </span>
                    </h3>
                    <p class="!text-base">
                      {post.body
                        ?.replace(/<[^>]+>/g, "")
                        .split(" ")
                        .slice(0, 40)
                        .join(" ")}
                      ...
                    </p>
                  </a>
                </li>
              ))
            }
          </ul>
        </details>
      </div>
    </div>
  </main>
</Layout>
So unfortunately I cannot just copy it over to index.html and be done with it… because there’s no JS runtime.
We may not have a runtime… but what if we introduced a runtime?
The Wildcard
What sucks about most static website content is that, past a certain point, it becomes unmanageable if you’re dealing purely with HTML.
We only have html pages, but maybe we have shared elements on each page, whether header or footer… and you can try to solve this with iframes, but depending on how the element is composed, it’d be hard to update it. You can usually load posts through iframes, but that’s it. I think the key thing here though is composability. Nesting iframes within iframes within iframes isn’t that practical.
Anyway, so you migrate to a static site generator. And it works. But I cannot stand to update this website, because generators feel so strangely rigid. It’s not that fun to build on static site generators… I don’t know why. I guess it feels heavy. Whatever the case, the amount of nonsense needed to get a static generator all working sucks the fun out of building, mostly.
But what if we instead generated on the fly? There’s no manual build step. It’s automatic.
This is the wildcard I mentioned a while back.
Now, there are still a few decisions to make… but for starters, let’s just get it working.
First, let’s grab the lua library.
# mix.exs
defp deps do
  [
    # ...
    {:lua, "~> 0.3.0"}
  ]
end
mix deps.get
Okay… now let’s see…
Hours of fiddling
So here’s the shape thus far.
First, let’s actually create an index.lua file. We’ll need to figure out how to handle conflicts (e.g. if there’s both an index.html and an index.lua we’ll probably prioritize index.lua, but that’s for another day).
And then load it up with some code in the editor. Still figuring out the public-facing API, but I think something like this works for now:
local index = {}
index.test = 4

index.render = function()
  return "<div>hello " .. index.test .. "</div>"
end

return index
Each lua file needs to return a module that implements a certain shape. In this case, it needs to have a render function attached to it.
That’s saved. We need to make sure we can fetch index.lua:
# lib/project/hosting.ex
def get_site_file_by_path(%Scope{} = scope, path) do
  ext = [".lua"]

  where =
    case Path.extname(path) do
      "" -> dynamic([sf], sf.path in ^Enum.map(ext, &Path.join(path, "index#{&1}")))
      _ -> dynamic([sf], sf.path == ^path)
    end

  file =
    from(sf in SiteFile,
      where: sf.site_id == ^scope.site.id,
      where: ^where
    )
    |> Repo.all()
    |> List.first()

  {:ok, contents} = Project.Storage.load(scope, file.path)
  {:ok, file, contents}
end
This is one really cool aspect of Ecto, the SQL query builder for Elixir: dynamic clauses. We can change the where clause depending on whether we received a directory in the path.
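For example, the same function handles both a concrete path and a directory-style path (assuming those files exist on the site):
Hosting.get_site_file_by_path(scope, "about.html")
# -> matches sf.path == "about.html" exactly

Hosting.get_site_file_by_path(scope, "blog")
# -> no extension, so it falls back to looking for "blog/index.lua"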
So that’s working, and we can load the lua file contents.
What needs to happen is that, on request to the / on the site, we need to grab that lua file and execute it.
So let’s add some janky proof of concept.
# lib/project_web/controllers/file_controller.ex
# ...
with # ...
     {:ok, file, contents} <- Hosting.get_site_file_by_path(scope, Enum.join(path, "/")) do
  if String.ends_with?(file.path, ".lua") do
    result = Runtime.eval(contents)

    conn
    |> put_status(200)
    |> put_resp_content_type("text/html")
    |> send_resp(200, result)
  else
    # ...
So I think we can check the mime_type of the file instead of looking at extensions. But that’s for later.
We need to create a Runtime module:
defmodule Project.Runtime do
  def eval(code) do
    Lua.new()
    |> Lua.eval!(code)
  end
end
Here we instantiate the Lua virtual machine and then run the code.
So it runs, but the question becomes, “how can we run the render function the user provided?”
To do so, we need to look at the result:
{[[{"render", #Function<7.92165918/2 in :luerl.decode_luafunc/3>}, {"test", 4}]],
#Lua<>}
It gives us back the module that was made in index.lua. To make it easier to work with, we can transform this into a Map and pattern match on it:
# lib/project/runtime.ex
defmodule Project.Runtime do
  def eval(code) do
    Lua.new()
    |> Lua.eval!(code)
    |> handle_result()
  end

  defp handle_result({[module], lua}) do
    case Map.new(module) do
      %{"render" => ref} ->
        {[rendered], _lua} = Lua.call_function!(lua, ref, [])
        rendered

      # catch-all: the module didn't expose a render function
      _ ->
        :noop_or_error
    end
  end
end
I think there’s a cleaner way to go about this, but that’ll become evident the more we work with the Lua module. Need to do some error handling too…
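Probably something like catching whatever Lua.eval! raises and turning it into a tagged tuple; a rough sketch (the safe_eval name and the error shape are placeholders):
# lib/project/runtime.ex (sketch)
def safe_eval(code) do
  {:ok, eval(code)}
rescue
  # User code can fail to compile or blow up at runtime; don't let it
  # take the whole request down with it.
  e -> {:error, Exception.message(e)}
end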
But if we visit test.app.localhost we’re greeted with:
<div>hello 4</div>
Now you’re probably thinking, that’s cool and all, but this:
local index = {}
index.test = 4

index.render = function()
  return "<div>hello " .. index.test .. "</div>"
end

return index
Doesn’t seem very ergonomic…
And you’re right. It’s not. We need a templating engine.
Luckily there’s already a cool one for lua we can work off of, called etlua.
Here’s what that looks like:
<h1 class="header"><%= "Hello" %></h1>
<% if current_user then %>
  <div class="user_panel">
    Welcome back <%= current_user.name %>
  </div>
<% end %>
<div class="body">
  Welcome to my site
</div>
We could probably get very far using just this library. But I think we can make it more fun later on.
Four hours later
So etlua was created in a separate language that compiles to lua. If you look at the generated lua source code, it is hard to make sense of.
And it’s just not working with the luerl implementation.
Additionally, I had to patch one of the internal string builtin functions. But now it’s just not working because of a strange use of the load builtin, which apparently needs the first three arguments to be strings, reading through GitHub.
I can’t really debug it; print statements don’t work. So the answer is to make our own templating engine.
Few hours more
One more thing: the _ENV which was added around Lua 5.2 isn’t noted as completely supported in luerl. So all this time I’ve been calling the load function to load in a chunk, but the variables passed in the fourth argument (the env, really) don’t show up.
Finally the solution is to create a function which takes in all of the variables and outputs the result.
One thing I’m now worried about is how to surface lua errors to users. Maybe we need a console and a channel to update what’s happening, or at least forward some print statements that way. So that’s duly noted.
Another thing is surfacing templating errors. I think we need to actually track the cursor or maybe even make an AST… maybe. I don’t know yet. Right now we have a simple one and it works.
I think we can advance the templating beyond the classic ERB, but in due time. This is also good anyway, because even if etlua did work, it wouldn’t have been sufficient for what I have in mind.
Anyway, here’s what we got now:
local template = [[
<h1><%= @title %></h1>
<ul>
<% for _, item in ipairs(@items) do %>
<li><%= item %></li>
<% end %>
</ul>
]]

index.render = function()
  return Template.render(template, {
    title = "Groceries",
    items = { "Apples", "Bananas", "Carrots" }
  })
end

return index
And if we refresh the page, we now execute and return the html:
<body>
  <h1>Groceries</h1>
  <ul>
    <li>Apples</li>
    <li>Bananas</li>
    <li>Carrots</li>
  </ul>
</body>
Here’s the rudimentary templating library, which I’ll break up and explain through each section:
local Template = {}

function Template.transform_at_variables(code)
  return string.gsub(code, "@([%a_][%w_]*)", "assigns.%1")
end
This function lets us actually handle the passed-in variables. So, for example, that @title is passed in under the table called assigns (arbitrarily chosen; it holds all of the “assignments” you want to make to the template).
function Template.compile(source)
  local chunks = {
    "return function(assigns)\n",
    " local _out = {}\n"
  }
What we need to do is create a lua function which returns the HTML we want from the template. To do that, we build a function up while reading the template. Then we can run load from lua to use the function.
  local cursor = 1

  while true do
    local start_pos, end_pos, is_expression, lua_code =
      string.find(source, "<%%(=?)%s*(.-)%s*%%>", cursor)
    if not start_pos then break end
We need to step through the template. The “cursor” starts at the very beginning, at 1. Lua famously starts all its indices at 1 instead of 0, which is why it’s 1. We’re going to move this “cursor” through the given template until it reaches the end, handling stuff along the way.
The esoteric string you see there is due to lua patterns, a variation of regex, that’s all. We look for an instance of <% lua_code %> that is ahead of cursor, which is why cursor is passed in there.
    local plain_text = string.sub(source, cursor, start_pos - 1)
    if #plain_text > 0 then
      table.insert(chunks, string.format(" _out[#_out+1] = [==[%s]==]\n", plain_text))
    end
Wherever the cursor is at, whether at the very beginning or just past the previous <% %>, everything in between the cursor and the start position, start_pos, is plain text. We just want to emit it. So, for example, <h1> will be stored at that _out index.
The [==[ ]==] is lua’s multiline string syntax.
    local transformed = Template.transform_at_variables(lua_code)
This is where we make @title into assigns.title, a lookup in the table we’re passing into this function, which you can see defined in chunks at the top. It would’ve been nicer to just pass it as an environment variable, but that wouldn’t work since, at least it seems, luerl does not support _ENV.
    if is_expression == "=" then
      table.insert(chunks, string.format(" _out[#_out+1] = (%s)\n", transformed))
    else
      table.insert(chunks, " " .. transformed .. "\n")
    end

    cursor = end_pos + 1
  end
In our templating library, there’s a difference between <% %> and <%= %>. The latter means “insert this into the template”, while <% %> means “I want this literal lua code to just be part of the runtime”. So for example <%= @title %> outputs the string in title. In comparison, <% for i, v in ipairs(items) %> is used as part of the literal program.
  local remaining = string.sub(source, cursor)
  if #remaining > 0 then
    table.insert(chunks, string.format(" _out[#_out+1] = [==[%s]==]\n", remaining))
  end

  table.insert(chunks, " return table.concat(_out)\nend")
  return table.concat(chunks)
end
By this point we have gone through all instances of <%, so we just want to emit the rest of the template as a string.
Finally we can bind the function together. When we run:
local template = [[
<h1><%= @title %></h1>
<ul>
<% for _, item in ipairs(@items) do %>
<li><%= item %></li>
<% end %>
</ul>
]]

print(Template.compile(template))
This is what the produced function looks like:
function(assigns)
  local _out = {}
  _out[#_out + 1] = [==[<h1>]==]
  _out[#_out + 1] = (assigns.title)
  _out[#_out + 1] = [==[</h1>
<ul>
]==]
  for _, item in ipairs(assigns.items) do
    _out[#_out + 1] = [==[
<li>]==]
    _out[#_out + 1] = (item)
    _out[#_out + 1] = [==[</li>
]==]
  end
  _out[#_out + 1] = [==[
</ul>
]==]
  return table.concat(_out)
end
The reason why it looks so weird is that the [==[ preserves newlines.
So we take that function, and then run it:
function Template.render(source, assigns)
  local code = Template.compile(source)
  local chunk, err = load(code, "template", "t")
  if not chunk then error(err) end

  local template_func = chunk()
  return template_func(assigns)
end
Thus, when we run:
local template = [[
<h1><%= @title %></h1>
<ul>
<% for _, item in ipairs(@items) do %>
<li><%= item %></li>
<% end %>
</ul>
]]

Template.render(template, {
  title = "Groceries",
  items = { "Apples", "Bananas", "Carrots" }
})
It produces the HTML we want:
<body>
  <h1>Groceries</h1>
  <ul>
    <li>Apples</li>
    <li>Bananas</li>
    <li>Carrots</li>
  </ul>
</body>
This is only the beginning. I can imagine a lot of cool things we can do with Lua. Still need to figure out how to propagate errors though. We have a lot to refine when it comes to the Lua templating library.
We also need to update the editor to support Lua. We might have to write our own highlighter using codemirror libraries.
One thing I kept debating is whether to use Elixir to implement the templating library.
Technically we could have, and maybe we will; not sure yet. It does come out of the box with parsing/AST support. We’d just have to rewrite all expressions to be executed in Lua instead of Elixir. May as well keep pressing forward with our current library.
Up Next
What was accomplished in this time span? The collaboration stuff is now working nicely. We moved saving/deleting to the websocket and spun up another channel to handle that. There were some nice improvements to the file tree, some emacs keybindings added, and file handling migrated to IDs. Finally, I added the start of a runtime, with the beginnings of an HTML templating library to expand upon.
Hopefully the next chapter has more UI improvements, and with them more screenshots, than this time around.
There is a lot to do, but I’m thinking next chapter will be about improving the templating experience. I want to figure out a way to completely “debug” a lua file purely within the online editor, and to make composability easy.
Though there’s a lot more to do with the runtime, just supporting some templating is v1. Need to figure out the import story. We’ll see.