this post was submitted on 02 Sep 2023
128 points (97.8% liked)

Rust


For a larger project requiring the ability to work with my computer in 3D, I developed a 3D terminal emulator. It uses the wonderful alacritty_terminal crate from the Alacritty terminal emulator for VTE parsing and PTY initialization, and it uses wgpu and rend3 for the rendering code. The actual text rendering is done with multichannel signed distance fields (MSDFs), generated lazily at runtime into a texture atlas using the msdfgen crate.
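For readers unfamiliar with MSDF text rendering: each atlas texel stores three distance channels, and taking their median at sample time reconstructs the signed distance to the glyph edge while preserving sharp corners that a single-channel SDF would round off. Here's a toy sketch of the decode step (the names and the `px_range` parameter are mine for illustration; the real work happens in a fragment shader, and this mirrors the decode that msdfgen's documentation describes, not this project's actual code):

```rust
// Median of three distance channels: reconstructs the true signed
// distance from an MSDF texel, keeping corners sharp.
fn median3(r: f32, g: f32, b: f32) -> f32 {
    f32::max(f32::min(r, g), f32::min(f32::max(r, g), b))
}

/// Coverage of the glyph at one pixel, given an MSDF texel and the
/// screen-space width (in pixels) of one distance-field unit.
fn msdf_coverage(texel: [f32; 3], px_range: f32) -> f32 {
    let sd = median3(texel[0], texel[1], texel[2]) - 0.5; // 0.5 marks the glyph edge
    (sd * px_range + 0.5).clamp(0.0, 1.0)
}

fn main() {
    // Well inside the glyph: fully opaque.
    assert_eq!(msdf_coverage([1.0, 1.0, 1.0], 4.0), 1.0);
    // Well outside: fully transparent.
    assert_eq!(msdf_coverage([0.0, 0.0, 0.0], 4.0), 0.0);
    // Exactly on the edge: half coverage, which is what gives smooth antialiasing.
    assert!((msdf_coverage([0.5, 0.5, 0.5], 4.0) - 0.5).abs() < 1e-6);
}
```

Because the distance field is resolution-independent, the same small atlas stays legible whether the terminal is an inch from your face or across the virtual room, which is presumably why it suits VR text so well.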

A project like this would've been impossible if not for Rust's friendly library ecosystem and the Alacritty project's efforts to make their core terminal emulator code reusable for other means.

top 16 comments
[–] [email protected] 15 points 1 year ago (2 children)

Looks nice, but I'm more interested in what project would make this useful. I can only think of VR stuff, but then you'd probably use a VR framework for it.

[–] [email protected] 22 points 1 year ago

Imagine being stuck in VR Vim.

[–] [email protected] 13 points 1 year ago (1 children)

It is a VR project! I needed fast, highly legible text from a variety of viewing angles, and MSDFs fit the bill.

I'm curious, when you say a VR framework, do you mean something like Stereokit?

[–] [email protected] 3 points 1 year ago (1 children)

I don't actually know anything about VR development; I just assumed you'd be working with something like a game engine to handle the play area, controls, and stereo 3D rendering, and generally to act as a platform for running multiple applications on one headset.

Stereokit looks like it's exactly that. I'm wondering, how do you solve those things if you're not using a framework?

[–] [email protected] 8 points 1 year ago (1 children)
cargo add blood
cargo add sweat
cargo add tears

The go-to API these days for VR (or XR more broadly) is OpenXR, a standard for AR/VR developed by the Khronos Group, who also standardized OpenGL and Vulkan. OpenVR is an older, SteamVR-specific VR library that has been phased out in favor of OpenXR. Oculus also had its own SDK for developing VR apps for both desktop VR and Quest, which has likewise been deprecated in favor of OpenXR.

When you initialize OpenXR, the API loader you've instantiated connects to the runtime's compositor, which is typically a daemon process. The OpenXR API has extensions for whichever graphics API you're rendering with (OpenGL, Vulkan, DirectX, and so on), and you hook into the compositor through the matching graphics extension to exchange GPU objects for compositing.

OpenXR also handles frame synchronization (which is a tricky subject in VR because of how tight latency needs to be to not give the user motion sickness) and input handling, which is compositor- and hardware-specific.

For Rust's purposes, you can find an example of how to plumb OpenXR's low-level graphics to wgpu here: https://github.com/philpax/wgpu-openxr-example

Once you can follow OpenXR's rules for graphics and synchronization correctly, you're off to the races, and there isn't much of a difference, at least on the lowest level, between your XR runtime and any other game engine or simulator.
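The frame-synchronization cadence described above can be sketched with a mock. In real OpenXR, `xrWaitFrame` throttles the app and returns a predicted display time, and `xrBeginFrame`/`xrEndFrame` bracket rendering; everything here (`MockCompositor`, the field names) is a hypothetical stand-in for the runtime, not the actual `openxr` crate API:

```rust
// Hypothetical mock of the OpenXR frame cadence. A real compositor
// also blocks in wait_frame to pace the app against the display.
struct MockCompositor {
    frame_interval_ns: u64,
    next_display_ns: u64,
}

struct FrameState {
    predicted_display_ns: u64,
}

impl MockCompositor {
    // Corresponds to xrWaitFrame: predict when the frame being
    // produced will actually reach the display, so head poses can be
    // extrapolated to that moment (this is what keeps latency-induced
    // motion sickness at bay).
    fn wait_frame(&mut self) -> FrameState {
        let state = FrameState { predicted_display_ns: self.next_display_ns };
        self.next_display_ns += self.frame_interval_ns;
        state
    }

    // Corresponds to xrBeginFrame: signals rendering has started.
    fn begin_frame(&mut self) {}

    // Corresponds to xrEndFrame: hand composited layers back, tagged
    // with the display time the pose prediction was computed for.
    fn end_frame(&mut self, display_ns: u64) {
        assert!(display_ns < self.next_display_ns);
    }
}

fn main() {
    // ~90 Hz headset: one frame every ~11.1 ms.
    let mut comp = MockCompositor { frame_interval_ns: 11_111_111, next_display_ns: 0 };
    for _ in 0..3 {
        let frame = comp.wait_frame();
        comp.begin_frame();
        // ... render both eye views for the pose predicted at
        // frame.predicted_display_ns ...
        comp.end_frame(frame.predicted_display_ns);
    }
}
```

The wgpu-openxr-example repo linked above shows what this loop looks like against the real API.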

Stereokit is really cool because it builds on OpenXR with a bunch of useful features like a UI system and virtual hands rendered where your controllers are. It also supports desktop mode.

If you're interested in this kind of stuff, I highly recommend reading the OpenXR specification. Despite being made by the same consortium that wrote the OpenGL and Vulkan specs, the OpenXR spec is highly readable and a really good introduction to low-level VR software engineering.

[–] [email protected] 2 points 1 year ago

Bro thanks for the intro to vr dev, this is really cool stuff

[–] [email protected] 13 points 1 year ago

After seeing the screenshot and not looking at the community first, I thought you had made a terminal emulator in the game, and not the language.

[–] [email protected] 9 points 1 year ago (1 children)

Where's the code? I'm sure everyone interested in VR development could benefit

[–] [email protected] 5 points 1 year ago

Thank you for the encouragement! Unfortunately, I'm not able to share the source code just yet, but I do intend to license it under the Apache license as soon as I can. Please refer to this comment for more info: https://programming.dev/comment/2568603

[–] [email protected] 4 points 1 year ago

Isn't that the skybox from LearnOpenGL.com? Recognized it almost immediately.

[–] [email protected] 4 points 1 year ago (1 children)

Are you making a vr desktop/wm?

[–] [email protected] 9 points 1 year ago* (last edited 1 year ago) (1 children)

Almost! I'm developing an experimental actor-based VR runtime geared towards supporting hot-reload as a first-class development workflow. The idea is that by letting all functionality, executed as WebAssembly, be added, reloaded, or removed at runtime, it should eventually become possible to develop the space entirely within the space, i.e. without ever needing to remove your VR headset.

A prerequisite for that is some "bootstrapping" environment for writing new behavior and compiling it to WebAssembly that doesn't depend on the space already having VR tooling. I decided that a terminal emulator had the highest flexibility-to-time-investment ratio, based on my experience building another prototype terminal emulator with alacritty_terminal.

Chances are the native terminal emulator will come in handy over the entire lifetime of the project's development: the other contributors and I are regular modal editor users, and as far as I know there are currently no self-hosting WebAssembly compilers that can be invoked without some kind of access to the native OS.
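The hot-reload idea above can be sketched very roughly. In this toy, boxed closures stand in for loaded WebAssembly modules; all the names (`Runtime`, `Behavior`, `load`, `send`) are hypothetical illustrations of the actor-registry shape, not this project's actual design or a real Wasm runtime API:

```rust
use std::collections::HashMap;

// A "behavior" stands in for a loaded WebAssembly module: something
// that receives a message and produces a reply.
type Behavior = Box<dyn Fn(&str) -> String>;

// Hypothetical actor registry: behaviors keyed by name, swappable
// while the runtime keeps running.
#[derive(Default)]
struct Runtime {
    actors: HashMap<String, Behavior>,
}

impl Runtime {
    // Loading a name that already exists *replaces* the old behavior;
    // this is the whole trick behind hot reload.
    fn load(&mut self, name: &str, behavior: Behavior) {
        self.actors.insert(name.to_string(), behavior);
    }

    fn send(&self, name: &str, msg: &str) -> Option<String> {
        self.actors.get(name).map(|b| b(msg))
    }
}

fn main() {
    let mut rt = Runtime::default();
    rt.load("greeter", Box::new(|m| format!("v1: {m}")));
    assert_eq!(rt.send("greeter", "hi").as_deref(), Some("v1: hi"));

    // Recompile-and-reload without ever tearing the runtime down —
    // i.e., without taking the headset off.
    rt.load("greeter", Box::new(|m| format!("v2: {m}")));
    assert_eq!(rt.send("greeter", "hi").as_deref(), Some("v2: hi"));
}
```

In the real system the behaviors would be sandboxed Wasm instances rather than closures, which is what makes runtime replacement safe in the first place.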

[–] [email protected] 3 points 1 year ago (1 children)

Sounds like a pretty amazing project tbh. Any chance you will show that off or demo at any point? Is it an open project or are you planning to sell it or is it just for internal use?

[–] [email protected] 3 points 1 year ago (1 children)

I plan on publishing the majority of my project's source code under the Affero GPL v3, but this terminal emulator and other associated projects that could be made independent from my project will be published under Apache.

This is more of a hobby/passion project for me and the other contributors than anything else, so our primary goal is just to make something that's functional and usable to everyone. We don't plan on ever making a profit off of this unless we get the opportunity to keep the copyleft license while doing so. We hope that the AGPL license always lets free software enthusiasts have the opportunity to use our software over existing VR tech stacks like Unity, VRChat, or Neos, which are all non-freedom-respecting, non-self-hosting, and depending on who you ask, very poorly moderated.

However, right now, the project is so early-stage that I'm not comfortable seriously publicly promoting it just yet. Our documentation is currently in a state of disarray and we're trying to find a way to advance the project in our very limited free time.

I'd love to share the source code, but we can't just yet. Hold on tight!

[–] [email protected] 2 points 1 year ago

I can't speak for others, but I am saddened by the current state of VR. I was hopeful we'd have all kinds of desktop environments and applications to go with it. I break out my HMD every so often, and I'm still holding out hope that something like Monado can provide a good experience. I imagine booting my headless machine with only my tethered HMD, but I guess that's a pipe dream haha.

Anyway thanks for sharing and keep us posted 👍

[–] [email protected] 2 points 1 year ago

This is useful enough for me to want to use it now. Any way that we can? I like to write stuff in Vim and this seems immersive.