V was created because none of the existing languages had all of the following features:

| Feature | Languages that have it |
| --- | --- |
| Fast compilation | D, Go, Delphi |
| Simplicity & maintainability | Go |
| Great performance on par with C and zero cost C interop | C, C++, D, Delphi, Rust |
| Safety (immutability, no null, option types, free from data races) | Rust |
| Easy concurrency | Go |
| Easy cross compilation | Go |
| Compile time code generation | D |
| Small compiler with zero dependencies | - |
| No global state | - |
| Hot code reloading | C# (.NET 6+), Dart |
Initially I was going to compare V to all major languages, but it got repetitive pretty quickly.
The table above and the list of the features on the home page should give you a pretty good picture.
For example, it's pretty obvious that compared to C++, V is much simpler. It offers significantly faster compilation, safety, lack of undefined behavior (wip, e.g. integer overflow can still result in UB), easy concurrency, compile-time code generation, etc.
Compared to Python, it's much faster, simpler, safer, more maintainable, etc.
You can use this formula for any language.
Syntax comparison:
Since V is very similar to Go, and its domain is similar to Rust's, I've kept a detailed comparison with these two languages.
V is very similar to Go, and these are the things it improves upon (a short sketch illustrating a few of them follows the list):

- No `err != nil` checks (replaced by result types)
- No variable shadowing
- Immutability by default
- Enums
- Sum types (`type Expr = IfExpr | StringLiteral | IntLiteral | ...`)
- String interpolation: `println('${foo}: ${bar.baz}')`
- `if` and `match` expressions (including sum type matches)
- No global state (globals can be enabled for low-level applications like kernels via a command line flag)
- A simple way to check whether an array contains an element: `if elem in arr {`
- Only one declaration style (`a := 0`), therefore no uninitialized variables
- Warnings for unused imports and variables for quicker development without annoying interruptions, but only in development/debugging mode. Making a production build still requires fixing all of them, thus enforcing clean code.
- `filter`/`map`/`reduce` methods for arrays and maps
- Much smaller runtime
- Much smaller binaries (a simple web server written in V is ~600 KB vs ~7 MB in Go)
- Zero cost C interop
- GC is optional
- Much faster serialization using codegen and no runtime reflection
- Simpler local modules: `import internal.css_lexer` instead of `import "github.com/evanw/esbuild/internal/css_lexer"`
- Precompiled text and HTML templates, unlike Go's `html/template`, which has to be parsed on every request (or pre-cached and executed on every request) and has to be deployed with the app's binary
- Fearless concurrency (data races are ruled out at compile time) (wip)
- No null (`null` is only allowed in `unsafe` code)
- Stricter `vfmt` to ensure one coding style
- Centralized package manager: vpm.vlang.io (`v install ...`)
- Much simpler and less verbose testing with `assert`
- Primitive types can have methods, resulting in less verbose code: `strings.Replace(strings.Replace(s, "a", "A", -1), "b", "B", -1)` => `s.replace('a', 'A').replace('b', 'B')`
- Arrays and maps (and arrays of arrays, arrays of maps, etc.) are automatically allocated. No more nil reference panics if you forgot to allocate each map in a loop.
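The following is a rough, minimal sketch illustrating a few of the points above (result types instead of `err != nil` checks, sum types with `match`, the `in` operator, and `filter`). The `Expr`/`IntLit`/`StrLit` types and the `parse_port` helper are made up for this example and are not part of V's standard library:

```v
// Minimal illustrative sketch; the types and helper below are invented
// for this example, they are not part of V's standard library.
struct IntLit {
	val int
}

struct StrLit {
	val string
}

// A sum type, as in the `type Expr = ...` point above.
type Expr = IntLit | StrLit

// `!int` is a result type: callers handle failure with `or { ... }`,
// so there is no `err != nil` boilerplate at every call site.
fn parse_port(s string) !int {
	n := s.int()
	if n <= 0 {
		return error('invalid port: ${s}')
	}
	return n
}

fn main() {
	port := parse_port('8080') or { panic(err) }
	println('port: ${port}') // string interpolation
	exprs := [Expr(IntLit{val: 1}), Expr(StrLit{val: 'hi'})]
	for e in exprs {
		// `match` is an expression; `e` is smart-cast in each branch.
		msg := match e {
			IntLit { 'int literal: ${e.val}' }
			StrLit { 'string literal: ${e.val}' }
		}
		println(msg)
	}
	nums := [1, 2, 3, 4]
	if 3 in nums { // the `in` operator on arrays
		println(nums.filter(it % 2 == 0)) // [2, 4]
	}
}
```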
Rust is a complex language with a growing set of features and a steep learning curve. No doubt, once you learn and understand the language, it becomes a very powerful tool for developing safe, fast, and stable software. But the complexity is still there.
V's goal is to allow building maintainable and predictable software. That's why the language is so simple, and maybe even boring for some. The good thing is, you can jump into any part of the project, understand what's going on, and feel as if you wrote it yourself, because the language is simple and there's only one way of doing things.
Rust's compilation speed is slow, on par with C++'s. V compiles 1.2 million lines of code per second per CPU core.
Since V's domain is close to both Go and Rust, I decided to use a simple example to compare the three: a small program that concurrently fetches the top Hacker News stories. (Note that the Go and V examples use only the standard library; the Rust example needs the external reqwest and serde crates, since Rust's standard library has no HTTP client or JSON support.)
Rust
```rust
use serde::Deserialize;
use std::sync::{Arc, Mutex};

const STORIES_URL: &str = "https://hacker-news.firebaseio.com/v0/topstories.json";
const ITEM_URL_BASE: &str = "https://hacker-news.firebaseio.com/v0/item";

#[derive(Deserialize)]
struct Story {
    title: String,
}

fn main() {
    let story_ids: Arc<Vec<u64>> = Arc::new(reqwest::get(STORIES_URL).unwrap().json().unwrap());
    let cursor = Arc::new(Mutex::new(0));
    let mut handles = Vec::new();
    for _ in 0..8 {
        let cursor = cursor.clone();
        let story_ids = story_ids.clone();
        handles.push(std::thread::spawn(move || loop {
            let index = {
                let mut cursor_guard = cursor.lock().unwrap();
                let index = *cursor_guard;
                if index >= story_ids.len() {
                    return;
                }
                *cursor_guard += 1;
                index
            };
            let story_url = format!("{}/{}.json", ITEM_URL_BASE, story_ids[index]);
            let story: Story = reqwest::get(&story_url).unwrap().json().unwrap();
            println!("{}", story.title);
        }));
    }
    for handle in handles {
        handle.join().unwrap();
    }
}
```

Go
```go
package main

import (
	"encoding/json"
	"fmt"
	"io/ioutil"
	"net/http"
	"sync"
)

const STORIES_URL = "https://hacker-news.firebaseio.com/v0/topstories.json"
const ITEM_URL_BASE = "https://hacker-news.firebaseio.com/v0/item"

type Story struct {
	Title string
}

func main() {
	rsp, err := http.Get(STORIES_URL)
	if err != nil {
		panic(err)
	}
	defer rsp.Body.Close()
	data, err := ioutil.ReadAll(rsp.Body)
	if err != nil {
		panic(err)
	}
	var ids []int
	if err := json.Unmarshal(data, &ids); err != nil {
		panic(err)
	}
	var cursor int
	var mutex sync.Mutex
	next := func() int {
		mutex.Lock()
		defer mutex.Unlock()
		temp := cursor
		cursor++
		return temp
	}
	wg := sync.WaitGroup{}
	for i := 0; i < 8; i++ {
		wg.Add(1)
		go func() {
			for cursor := next(); cursor < len(ids); cursor = next() {
				url := fmt.Sprintf(
					"%s/%d.json",
					ITEM_URL_BASE,
					ids[cursor],
				)
				rsp, err := http.Get(url)
				if err != nil {
					panic(err)
				}
				defer rsp.Body.Close()
				data, err := ioutil.ReadAll(rsp.Body)
				if err != nil {
					panic(err)
				}
				var story Story
				if err := json.Unmarshal(data, &story); err != nil {
					panic(err)
				}
				fmt.Println(story.Title)
			}
			wg.Done()
		}()
	}
	wg.Wait()
}
```

V
```v
import net.http
import json

const stories_url = 'https://hacker-news.firebaseio.com/v0/topstories.json'
const item_base_url = 'https://hacker-news.firebaseio.com/v0/item'

struct Story {
	title string
}

struct Cursor {
mut:
	pos int
}

fn main() {
	resp := http.get(stories_url)!
	ids := json.decode([]int, resp.body)!
	shared cursor := Cursor{}
	mut threads := []thread{}
	for _ in 0 .. 8 {
		threads << go fn (ids []int, shared cursor Cursor) {
			for {
				id := lock cursor {
					if cursor.pos >= ids.len {
						break
					}
					cursor.pos++
					ids[cursor.pos - 1]
				}
				resp := http.get('${item_base_url}/${id}.json') or { panic(err) }
				story := json.decode(Story, resp.body) or { panic(err) }
				println(story.title)
			}
		}(ids, shared cursor)
	}
	threads.wait()
}
```

Nim
V and Nim are very different. One of V's main philosophies is "there must be only one way of doing things". This results in predictable, simple, and maintainable code.

Nim gives a lot of options and freedom to developers. For example, in V you would write `foo.bar_baz()`, but in Nim all of these are valid: `foo.barBaz()`, `foo.bar_baz()`, `bar_baz(foo)`, `barBaz(foo)`, `barbaz(foo)`, etc.
In V there's only one way to return a value from a function: `return value`. In Nim you can do `return value`, `result = value`, `value` (final expression), or modify a `ref` argument.
Features like macros and OOP offer multiple ways to solve the same problem, which increases complexity.
Nim's strings are mutable, which in my opinion is a huge drawback. I'll post a detailed article about the power of immutable strings.
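For comparison, here is a minimal V sketch of the immutable-by-default behavior (the variable names are just for illustration):

```v
fn main() {
	s := 'hello'
	// s = 'world'  // compile error: `s` was not declared with `mut`
	// s[0] = `H`   // compile error: V strings are immutable
	println(s)
	mut t := 'hello'
	t += ' world' // ok: `t` is `mut`; `+=` builds a new string
	println(t) // hello world
}
```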
Unlike V, Nim generates unreadable C code with lots of extra bloat. For example:
```nim
# The User object definition is not shown in the original snippet; the
# fields below are inferred from the generated C code that follows.
type User = object
  name: string
  last_name: string
  age: int

var users = [
  User(name: "Carl", last_name: "Black", age: 22),
  User(name: "Sam", last_name: "Johnson", age: 23)
]
```

If we build this with `nim c -d:release test.nim`, we get:
```c
STRING_LITERAL(TM_R8RUzYq41iOx0I9bZH5Nyrw_5, "Carl", 4);
STRING_LITERAL(TM_R8RUzYq41iOx0I9bZH5Nyrw_6, "Black", 5);
STRING_LITERAL(TM_R8RUzYq41iOx0I9bZH5Nyrw_7, "Sam", 3);
STRING_LITERAL(TM_R8RUzYq41iOx0I9bZH5Nyrw_8, "Johnson", 7);
NIM_CONST tyArray_m9aGbgPB3gZgFcKcDkjg9a8g TM_R8RUzYq41iOx0I9bZH5Nyrw_4 = {
	{((NimStringDesc*) &TM_R8RUzYq41iOx0I9bZH5Nyrw_5), ((NimStringDesc*) &TM_R8RUzYq41iOx0I9bZH5Nyrw_6), ((NI) 22)},
	{((NimStringDesc*) &TM_R8RUzYq41iOx0I9bZH5Nyrw_7), ((NimStringDesc*) &TM_R8RUzYq41iOx0I9bZH5Nyrw_8), ((NI) 23)}};
N_LIB_PRIVATE N_NIMCALL(void, NimMainModule)(void) {
	{
		TFrame FR_;
		FR_.len = 0;
	}
	nimRegisterGlobalMarker(TM_R8RUzYq41iOx0I9bZH5Nyrw_3);
	genericAssign((void*)users_oOczRkVOc3qtKT8rsAJzaw, (void*)TM_R8RUzYq41iOx0I9bZH5Nyrw_4, (&NTI_m9aGbgPB3gZgFcKcDkjg9a8g_));
}
N_LIB_PRIVATE N_NIMCALL(void, aDatInit000)(void) {
	static TNimNode* TM_R8RUzYq41iOx0I9bZH5Nyrw_2[3];
	static TNimNode TM_R8RUzYq41iOx0I9bZH5Nyrw_0[4];
	NTI_Qp0mfNOzxWdmSSWLHA9cnZQ_.size = sizeof(tyObject_User_Qp0mfNOzxWdmSSWLHA9cnZQ);
	NTI_Qp0mfNOzxWdmSSWLHA9cnZQ_.kind = 18;
	NTI_Qp0mfNOzxWdmSSWLHA9cnZQ_.base = 0;
	NTI_Qp0mfNOzxWdmSSWLHA9cnZQ_.flags = 2;
	TM_R8RUzYq41iOx0I9bZH5Nyrw_2[0] = &TM_R8RUzYq41iOx0I9bZH5Nyrw_0[1];
	TM_R8RUzYq41iOx0I9bZH5Nyrw_0[1].kind = 1;
	TM_R8RUzYq41iOx0I9bZH5Nyrw_0[1].offset = offsetof(tyObject_User_Qp0mfNOzxWdmSSWLHA9cnZQ, name);
	TM_R8RUzYq41iOx0I9bZH5Nyrw_0[1].typ = (&NTI_77mFvmsOLKik79ci2hXkHEg_);
	TM_R8RUzYq41iOx0I9bZH5Nyrw_0[1].name = "name";
	TM_R8RUzYq41iOx0I9bZH5Nyrw_2[1] = &TM_R8RUzYq41iOx0I9bZH5Nyrw_0[2];
	TM_R8RUzYq41iOx0I9bZH5Nyrw_0[2].kind = 1;
	TM_R8RUzYq41iOx0I9bZH5Nyrw_0[2].offset = offsetof(tyObject_User_Qp0mfNOzxWdmSSWLHA9cnZQ, last_name);
	TM_R8RUzYq41iOx0I9bZH5Nyrw_0[2].typ = (&NTI_77mFvmsOLKik79ci2hXkHEg_);
	TM_R8RUzYq41iOx0I9bZH5Nyrw_0[2].name = "last_name";
	TM_R8RUzYq41iOx0I9bZH5Nyrw_2[2] = &TM_R8RUzYq41iOx0I9bZH5Nyrw_0[3];
	TM_R8RUzYq41iOx0I9bZH5Nyrw_0[3].kind = 1;
	TM_R8RUzYq41iOx0I9bZH5Nyrw_0[3].offset = offsetof(tyObject_User_Qp0mfNOzxWdmSSWLHA9cnZQ, age);
	TM_R8RUzYq41iOx0I9bZH5Nyrw_0[3].typ = (&NTI_rR5Bzr1D5krxoo1NcNyeMA_);
	TM_R8RUzYq41iOx0I9bZH5Nyrw_0[3].name = "age";
	TM_R8RUzYq41iOx0I9bZH5Nyrw_0[0].len = 3;
	TM_R8RUzYq41iOx0I9bZH5Nyrw_0[0].kind = 2;
	TM_R8RUzYq41iOx0I9bZH5Nyrw_0[0].sons = &TM_R8RUzYq41iOx0I9bZH5Nyrw_2[0];
	NTI_Qp0mfNOzxWdmSSWLHA9cnZQ_.node = &TM_R8RUzYq41iOx0I9bZH5Nyrw_0[0];
	NTI_m9aGbgPB3gZgFcKcDkjg9a8g_.size = sizeof(tyArray_m9aGbgPB3gZgFcKcDkjg9a8g);
	NTI_m9aGbgPB3gZgFcKcDkjg9a8g_.kind = 16;
	NTI_m9aGbgPB3gZgFcKcDkjg9a8g_.base = (&NTI_Qp0mfNOzxWdmSSWLHA9cnZQ_);
	NTI_m9aGbgPB3gZgFcKcDkjg9a8g_.flags = 2;
}
```
V can emit native code directly (although V's native backend is not as complete as the C backend yet), while Nim has no native backend: it compiles only via C, C++, Objective-C, or JavaScript. It's also possible to embed C code in Nim, which reduces safety and portability.
Nim allows importing functions into the global namespace. This becomes a huge problem when working on large code bases. The explicit imports that V, Go, and Oberon have are much more practical: `pkg.function()` vs `function()`.
V's syntax is cleaner, with fewer rules. The lack of significant whitespace improves readability and maintainability of large code bases and makes generating code much easier. In my experience working with a huge Python code base, moving large blocks of code in whitespace-sensitive languages is scary.
The list could go on and on. Nim is a language with a lot of features, and it is still developing and changing. V is not going to change much, if at all.
Again, I'm not saying it's a worse language. It's a very different language that offers a lot of options and features. Many developers prefer this approach. And that's ok.