
I wasn't quite sure what kinds of deployment are supported. I clicked on the next.js link but it just redirected to the same page.


With the rush we forgot to update some of the cases in the product page, thanks for the feedback! We'll update them shortly.

Right now we are starting with Rust (with full networking with tokio) and static sites.
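For readers wondering what that looks like in practice: a WASIX build is essentially an ordinary Rust program that opens sockets, recompiled to a Wasm target. Here is a minimal std-only sketch of such a server (a real deployment would use tokio/axum as mentioned in the thread; the WASIX toolchain specifics are omitted, and `max_requests` is just an illustrative parameter so the example terminates):

```rust
use std::io::{Read, Write};
use std::net::TcpListener;

// Minimal blocking HTTP server using only the standard library.
// WASIX extends WASI with sockets (among other things), so a program
// shaped like this can be compiled to Wasm and served by Wasmer.
fn serve(listener: TcpListener, max_requests: usize) {
    for stream in listener.incoming().take(max_requests) {
        let mut stream = match stream {
            Ok(s) => s,
            Err(_) => continue,
        };
        // Read (and ignore) the request; a real server would parse it.
        let mut buf = [0u8; 1024];
        let _ = stream.read(&mut buf);
        let body = "Hello from WASIX!";
        let response = format!(
            "HTTP/1.1 200 OK\r\nContent-Length: {}\r\nContent-Type: text/plain\r\n\r\n{}",
            body.len(),
            body
        );
        let _ = stream.write_all(response.as_bytes());
    }
}

fn main() {
    // Bind to an OS-assigned port and answer a single request.
    let listener = TcpListener::bind("127.0.0.1:0").expect("bind failed");
    println!("listening on {}", listener.local_addr().unwrap());
    serve(listener, 1);
}
```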

Node.js (allowing Next.js to be fully runnable) and Python (allowing Django and Flask) support will be coming soon!


Also curious about how Node.js support would work.

I've heard about efforts to compile SpiderMonkey to WASM to run JS, but never V8 and the entirety of Node.js. That would be super exciting if that's what you're looking to accomplish. Would love to follow along and contribute if this is part of an open source effort!

Curious if you think cold starts could be a problem with this approach? Share-nothing architecture sounds great until you realize you have to load the entirety of V8 and Node.js on every request, but maybe you've figured out some way to work around that?

Also, how do you think the lack of JIT could affect real-world JS runtime performance? Or is there a solution for that too?


How are interpreted languages possible in wasmer?


You can compile the interpreter itself (such as CPython) to Wasm/WASIX and let Wasmer do its thing :)


What about third-party modules? Next.js depends on SWC, which is written in Rust. Will it run on wasmer?


It’s a technical challenge, but we believe it is fully doable.

BTW, SWC is already powered by Wasmer!


How is swc "powered by" wasmer?


SWC uses Wasmer for its plugin system to execute WASM plugins.


There are some early docs on this here: https://docs.wasmer.io/edge/learn/deployment-modes

TLDR: support for WASIX [1]-enabled builds that can start a web server (Rust with axum works well; Python and PHP are coming soon), and WCGI, which lets you use any language that currently has WASI support via the standard CGI interface.
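To make the WCGI side concrete: CGI maps each HTTP request to one run of a program, passing request metadata through environment variables and reading headers plus body from stdout; WCGI reuses that contract for Wasm modules. A generic CGI-style sketch in Rust (not Wasmer-specific code; REQUEST_METHOD and PATH_INFO are the standard CGI variable names):

```rust
use std::env;

// Build a CGI response: headers, a blank line, then the body.
// The host server (Wasmer Edge, in the WCGI case) turns this stdout
// stream back into an HTTP response for the client.
fn cgi_response(method: &str, path: &str) -> String {
    let body = format!("You sent a {} request to {}\n", method, path);
    format!(
        "Content-Type: text/plain\r\nContent-Length: {}\r\n\r\n{}",
        body.len(),
        body
    )
}

fn main() {
    // Standard CGI: request metadata arrives in environment variables.
    let method = env::var("REQUEST_METHOD").unwrap_or_else(|_| "GET".into());
    let path = env::var("PATH_INFO").unwrap_or_else(|_| "/".into());
    print!("{}", cgi_response(&method, &path));
}
```

Because the program is spawned per request and exits when done, any language with basic WASI support (env vars + stdout) can participate, which is what makes this mode so broad.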

We'll also add support for standard WASI with pre-opened sockets, which will add support for a bunch of additional ecosystems (like dotnet / C#), but haven't gotten around to that yet.

If you are asking more about how code is managed and executed: stateless, spawned on demand is the default. Proxy mode will re-use instances for multiple requests. Stateful/persistent instances with persistent volumes are on the horizon.

[1] https://wasix.org/



