
Modules (#17)

* chore: rename program -> module

* feat: add module builder

* feat: append standard library

* chore: fix clippy warnings

* chore: fix formatting

* feat: imports

* chore: fix formatting

* feat: resolve path deltas to entrypoint

* chore: fix formatting

* fix: path resolver

* chore: refactor stdlib

* docs: document modules

* docs: add "unreleased" section to changelog

* docs: add modules as unreleased bullet in changelog

* feat: resolve nested modules

* fix: clean up file resolvement

* chore: fix clippy lints
Garrit Franke committed 3 years ago (via GitHub)
commit 7810052c56
 CHANGELOG.md                              |   1
 docs/SUMMARY.md                           |   3
 docs/modules/SUMMARY.md                   |  66
 lib/array.sb                              |  27
 lib/assert.sb                             |   6
 lib/io.sb                                 |  10
 lib/os.sb                                 |   4
 lib/stdio.sb                              |  50
 lib/stdlib.sb                             |   4
 src/builder/mod.rs                        | 132
 src/command/build.rs                      |  41
 src/generator/c.rs                        |   2
 src/generator/js.rs                       |   2
 src/generator/llvm.rs                     |   6
 src/generator/mod.rs                      |   4
 src/generator/x86.rs                      |  10
 src/lexer/mod.rs                          |   2
 src/main.rs                               |   1
 src/parser/infer.rs                       |   2
 src/parser/mod.rs                         |   6
 src/parser/node_type.rs                   |  17
 src/parser/parser.rs                      |   8
 src/parser/rules.rs                       |  37
 src/parser/tests.rs                       | 106
 src/tests/test_examples.rs                |   5
 tests/importable_module/foo/bar.sb        |   5
 tests/importable_module/foo/baz/module.sb |   3
 tests/importable_module/module.sb         |   6
 tests/imports.sb                          |   5

CHANGELOG.md (1 changed line)

@@ -5,6 +5,7 @@
 **Features**
 - Match statements (#15)
+- Modules and imports (#17)

 ## v0.4.0 (2021-02-20)

docs/SUMMARY.md (3 changed lines)

@@ -9,7 +9,8 @@
 - [Functions](./concepts/functions.md)
 - [Comments](./concepts/comments.md)
 - [Control Flow](./concepts/control-flow.md)
-- [Structured data](./concepts/structured-data.md)
+- [Structured Data](./concepts/structured-data.md)
+- [Modules and Imports](./modules/SUMMARY.md)
 - [Developer Resources](./developers/SUMMARY.md)
 - [Contributing to Sabre](./developers/contributing.md)
 - [Compiler Backends](./developers/backends.md)

docs/modules/SUMMARY.md (new file, 66 lines)
# Modules and Imports
Projects naturally grow over time, and digging through 10,000 lines of code in a single file can be cumbersome. By grouping related functionality and separating code with distinct features, you'll clarify where to find code that implements a particular feature and where to go to change how a feature works.

The programs we've written so far have lived in a single file. As a project grows, you can organize code by splitting it into multiple modules, each with a clear name.
In Sabre, every file is also a module. Let's take a look at a project structure and identify its modules.
```
.
├── foo
│   ├── bar.sb
│   └── baz
│       └── module.sb
├── main.sb
└── some_logic.sb
```
By convention, the entrypoint for this project is the `main.sb` file in the root directory.
At the same directory level, there is a module called `some_logic`.
Next to it, there is a directory called `foo`, containing the submodule `bar`. To address the `bar` module from our entrypoint, we'd write the following import:
```
import "foo/bar"
```
> **Note**: File extensions in imports are optional. Importing `foo/bar.sb` would yield the same result as importing `foo/bar`.
## Module entrypoints
In the `foo` directory, there is another directory called `baz`, containing a single file named `module.sb`. This file is special: it serves as the entrypoint for that module. So, instead of importing the file explicitly:
```
// main.sb
import "foo/baz/module"
```
we can simply import the module containing this file, and Sabre will import the contained `module.sb` instead.
```
// main.sb
import "foo/baz"
```
## Using imported modules
To use code defined in a separate module, we first need to import it. This is usually done at the top of the file, although technically an import can appear anywhere in the document. Once the module is imported, we can use the code inside it as if it were in the current file.

Let's say we have a module named `math.sb` in the same directory as our `main.sb`, and it defines the function `add(x: int, y: int): int`. To call it in our `main.sb`, we'd do the following:
```
import "math"
fn main() {
println(add(1, 2))
}
```
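The chapter does not show the `math` module itself; a minimal version matching the assumed `add(x: int, y: int): int` signature could look like this (hypothetical contents):

```
// math.sb (hypothetical contents)
fn add(x: int, y: int): int {
    return x + y
}
```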
If we run `main.sb`, we should see the expected output. Sabre has imported the `add` function from the `math` module.
```
$ sabre run main.sb
3
```

lib/array.sb (new file, 27 lines)
// Prints the size of an array
fn len(arr: int[]): int {
    let c: int = 0

    while arr[c] {
        c += 1
    }

    return c
}

// Reverses an array
// TODO: fix me!
fn rev(arr: int[]): int[] {
    let l: int = len(arr)
    let new_arr: int[] = []

    let i: int = 0
    let j: int = l

    while i < l {
        new_arr[i] = arr[j]
        i = i - 1
        j = j - 1
    }

    return new_arr
}

lib/assert.sb (new file, 6 lines)
fn assert(condition: bool) {
    if condition == false {
        println("Assertion failed")
        exit(1)
    }
}

lib/io.sb (new file, 10 lines)
// Raw wrapper around _printf builtin function.
// Writes the given content to stdout
fn print(arg: string) {
    _printf(arg)
}

// Like print(), but with an extra newline ('\n') character
fn println(msg: string) {
    print(msg + "\n")
}

lib/os.sb (new file, 4 lines)
// Exit the program immediately
fn exit(code: int) {
    _exit(code)
}

lib/stdio.sb (deleted, 50 lines; its contents moved into the new lib/ modules)
// Raw wrapper around _printf builtin function.
// Writes the given content to stdout
fn print(arg: string) {
    _printf(arg)
}

// Like print(), but with an extra newline ('\n') character
fn println(msg: string) {
    print(msg + "\n")
}

// Exit the program immediately
fn exit(code: int) {
    _exit(code)
}

fn assert(condition: bool) {
    if condition == false {
        println("Assertion failed")
        exit(1)
    }
}

// Prints the size of an array
fn len(arr: int[]): int {
    let c: int = 0

    while arr[c] {
        c += 1
    }

    return c
}

// Reverses an array
// TODO: fix me!
fn rev(arr: int[]): int[] {
    let l: int = len(arr)
    let new_arr: int[] = []

    let i: int = 0
    let j: int = l

    while i < l {
        new_arr[i] = arr[j]
        i = i - 1
        j = j - 1
    }

    return new_arr
}

lib/stdlib.sb (new file, 4 lines)
import "array"
import "assert"
import "io"
import "os"

src/builder/mod.rs (new file, 132 lines)
use crate::generator;
use crate::lexer;
use crate::parser;
use crate::Lib;
use crate::PathBuf;
use parser::node_type::Module;
use std::env;
/**
 * Copyright 2021 Garrit Franke
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *     https://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
use std::fs::File;
use std::io::Read;
use std::io::Write;

pub struct Builder {
    in_file: PathBuf,
    modules: Vec<Module>,
}

impl Builder {
    pub fn new(entrypoint: PathBuf) -> Self {
        Self {
            in_file: entrypoint,
            modules: Vec::new(),
        }
    }

    fn get_base_path(&self) -> Result<PathBuf, String> {
        Ok(self
            .in_file
            .parent()
            .ok_or("File does not have a parent")?
            .to_path_buf())
    }

    pub fn build(&mut self) -> Result<(), String> {
        let in_file = self.in_file.clone();
        // Resolve path deltas between working directory and entrypoint
        let base_directory = self.get_base_path()?;
        if let Ok(resolved_delta) = in_file.strip_prefix(&base_directory) {
            // TODO: This error could probably be handled better
            let _ = env::set_current_dir(base_directory);
            self.in_file = resolved_delta.to_path_buf();
        }

        self.build_module(self.in_file.clone())?;

        // Append standard library
        self.build_stdlib();
        Ok(())
    }

    fn build_module(&mut self, file_path: PathBuf) -> Result<Module, String> {
        // TODO: This method can probably be cleaned up quite a bit

        // In case the module is a directory, we have to append the filename of the entrypoint
        let resolved_file_path = if file_path.is_dir() {
            file_path.join("module.sb")
        } else {
            file_path
        };
        let mut file = File::open(&resolved_file_path)
            .map_err(|_| format!("Could not open file: {}", resolved_file_path.display()))?;
        let mut contents = String::new();

        file.read_to_string(&mut contents)
            .expect("Could not read file");

        let tokens = lexer::tokenize(&contents);
        let module = parser::parse(
            tokens,
            Some(contents),
            resolved_file_path.display().to_string(),
        )?;
        for import in &module.imports {
            // Build module relative to the current file
            let mut import_path = resolved_file_path
                .parent()
                .unwrap()
                .join(PathBuf::from(import));

            if import_path.is_dir() {
                import_path = import_path.join("module.sb");
            } else if !import_path.ends_with(".sb") {
                import_path.set_extension("sb");
            }
            self.build_module(import_path)?;
        }
        self.modules.push(module.clone());
        Ok(module)
    }

    pub(crate) fn generate(&mut self, out_file: PathBuf) -> Result<(), String> {
        let mut mod_iter = self.modules.iter();

        // TODO: We shouldn't clone here
        let mut condensed = mod_iter.next().ok_or("No module specified")?.clone();
        for module in mod_iter {
            condensed.merge_with(module.clone());
        }

        let output = generator::generate(condensed);

        let mut file = std::fs::File::create(out_file).expect("create failed");
        file.write_all(output.as_bytes()).expect("write failed");
        file.flush().map_err(|_| "Could not flush file".into())
    }

    fn build_stdlib(&mut self) {
        let assets = Lib::iter();
        for file in assets {
            let stdlib_raw =
                Lib::get(&file).expect("Standard library not found. This should not occur.");
            let stblib_str =
                std::str::from_utf8(&stdlib_raw).expect("Could not interpret standard library.");
            let stdlib_tokens = lexer::tokenize(&stblib_str);
            let module = parser::parse(stdlib_tokens, Some(stblib_str.into()), file.to_string())
                .expect("Could not parse stdlib");
            self.modules.push(module);
        }
    }
}

src/command/build.rs (41 changed lines)

@@ -13,44 +13,11 @@
  * See the License for the specific language governing permissions and
  * limitations under the License.
  */
-use crate::generator;
-use crate::lexer;
-use crate::parser;
-use crate::Lib;
-use std::fs::File;
-use std::io::Read;
-use std::io::Write;
+use crate::builder;
 use std::path::Path;

 pub fn build(in_file: &Path, out_file: &Path) -> Result<(), String> {
-    let mut file = File::open(in_file).expect("Could not open file");
-    let mut contents = String::new();
-
-    file.read_to_string(&mut contents)
-        .expect("Could not read file");
-
-    let tokens = lexer::tokenize(&contents);
-    let mut program = parser::parse(tokens, Some(contents))?;
-
-    // C Backend currently does not support stdlib yet, since not all features are implemented
-    if cfg!(feature = "backend_node") {
-        let stdlib = build_stdlib();
-        program.merge_with(stdlib);
-    }
-    let output = generator::generate(program);
-
-    let mut file = std::fs::File::create(out_file).expect("create failed");
-    file.write_all(output.as_bytes()).expect("write failed");
-    file.flush().expect("Could not flush file");
-    Ok(())
-}
-
-fn build_stdlib() -> parser::node_type::Program {
-    let stdlib_raw =
-        Lib::get("stdio.sb").expect("Standard library not found. This should not occur.");
-    let stblib_str =
-        std::str::from_utf8(&stdlib_raw).expect("Could not interpret standard library.");
-    let stdlib_tokens = lexer::tokenize(&stblib_str);
-    parser::parse(stdlib_tokens, Some(stblib_str.into())).expect("Could not parse stdlib")
+    let mut b = builder::Builder::new(in_file.to_path_buf());
+    b.build()?;
+    b.generate(out_file.to_path_buf())
 }

src/generator/c.rs (2 changed lines)

@@ -20,7 +20,7 @@ use crate::util::Either;

 pub struct CGenerator;

 impl Generator for CGenerator {
-    fn generate(prog: Program) -> String {
+    fn generate(prog: Module) -> String {
         let mut code = String::new();

         let raw_builtins =

src/generator/js.rs (2 changed lines)

@@ -20,7 +20,7 @@ use std::collections::HashMap;

 pub struct JsGenerator;

 impl Generator for JsGenerator {
-    fn generate(prog: Program) -> String {
+    fn generate(prog: Module) -> String {
         let mut code = String::new();

         let raw_builtins =

src/generator/llvm.rs (6 changed lines)

@@ -16,16 +16,16 @@
 use crate::generator::Generator;
 use crate::parser::node_type::*;
 use inkwell::context::Context;
-use inkwell::module::Module;
+use inkwell::module;
 use inkwell::types::*;

 pub struct LLVMGenerator<'ctx> {
     ctx: &'ctx Context,
-    module: Module<'ctx>,
+    module: module::Module<'ctx>,
 }

 impl<'ctx> Generator for LLVMGenerator<'ctx> {
-    fn generate(prog: Program) -> String {
+    fn generate(prog: Module) -> String {
         let ctx = Context::create();
         let module = ctx.create_module("main");
         let mut generator = LLVMGenerator { ctx: &ctx, module };

src/generator/mod.rs (4 changed lines)

@@ -26,13 +26,13 @@ mod tests;
 pub mod x86;

 pub trait Generator {
-    fn generate(prog: Program) -> String;
+    fn generate(prog: Module) -> String;
 }

 // Since we're using multiple features,
 // "unreachable" statements are okay
 #[allow(unreachable_code)]
-pub fn generate(prog: Program) -> String {
+pub fn generate(prog: Module) -> String {
     #[cfg(feature = "backend_llvm")]
     return llvm::LLVMGenerator::generate(prog);
     #[cfg(feature = "backend_c")]

src/generator/x86.rs (10 changed lines)

@@ -14,7 +14,7 @@
  * limitations under the License.
  */
 use crate::generator::Generator;
-use crate::parser::node_type::{Function, Program, Statement};
+use crate::parser::node_type::{Function, Module, Statement};

 struct Assembly {
     asm: Vec<String>,
@@ -45,7 +45,7 @@ impl Assembly {
 pub struct X86Generator;

 impl Generator for X86Generator {
-    fn generate(prog: Program) -> String {
+    fn generate(prog: Module) -> String {
         Self::new().gen_program(prog).build()
     }
 }
@@ -55,12 +55,14 @@ impl X86Generator {
         X86Generator {}
     }

-    fn gen_program(&mut self, prog: Program) -> Assembly {
+    fn gen_program(&mut self, prog: Module) -> Assembly {
         let mut asm = Assembly::new();

-        let Program {
+        let Module {
             func,
             globals,
             structs: _,
+            path: _,
+            imports: _,
         } = prog;

         asm.add(".intel_syntax noprefix");

src/lexer/mod.rs (2 changed lines)

@@ -148,6 +148,7 @@ pub enum Keyword {
     Struct,
     New,
     Match,
+    Import,
     Unknown,
 }
@@ -376,6 +377,7 @@ impl Cursor<'_> {
             c if c == "struct" => Keyword::Struct,
             c if c == "new" => Keyword::New,
             c if c == "match" => Keyword::Match,
+            c if c == "import" => Keyword::Import,
             _ => Keyword::Unknown,
         }
     }

src/main.rs (1 changed line)

@@ -22,6 +22,7 @@ extern crate tempfile;
 use std::path::PathBuf;
 use structopt::StructOpt;

+mod builder;
 mod command;
 mod generator;
 mod lexer;

src/parser/infer.rs (2 changed lines)

@@ -19,7 +19,7 @@ use super::node_type::*;
 ///
 /// TODO: Global symbol table is passed around randomly.
 /// This could probably be cleaned up.
-pub(super) fn infer(program: &mut Program) {
+pub(super) fn infer(program: &mut Module) {
     let table = &program.get_symbol_table();
     // TODO: Fix aweful nesting
     for func in &mut program.func {

src/parser/mod.rs (6 changed lines)

@@ -20,11 +20,11 @@ pub mod node_type;
 mod parser;
 mod rules;
 use crate::lexer::Token;
-use node_type::Program;
+use node_type::Module;

 #[cfg(test)]
 mod tests;

-pub fn parse(tokens: Vec<Token>, raw: Option<String>) -> Result<Program, String> {
-    let mut parser = parser::Parser::new(tokens, raw);
+pub fn parse(tokens: Vec<Token>, raw: Option<String>, path: String) -> Result<Module, String> {
+    let mut parser = parser::Parser::new(tokens, raw, path);
     parser.parse()
 }

src/parser/node_type.rs (17 changed lines)

@@ -1,3 +1,6 @@
+use crate::lexer::*;
+use core::convert::TryFrom;
+use std::collections::HashMap;
 /**
  * Copyright 2020 Garrit Franke
  *
@@ -13,22 +16,22 @@
  * See the License for the specific language governing permissions and
  * limitations under the License.
  */
-use crate::lexer::*;
-use core::convert::TryFrom;
-use std::collections::HashMap;
+use std::collections::HashSet;

 /// Table that contains all symbol and its types
 pub type SymbolTable = HashMap<String, Option<Type>>;

-#[derive(Debug)]
-pub struct Program {
+#[derive(Debug, Clone)]
+pub struct Module {
+    pub path: String,
+    pub imports: HashSet<String>,
     pub func: Vec<Function>,
     pub structs: Vec<StructDef>,
     pub globals: Vec<String>,
 }

-impl Program {
-    pub fn merge_with(&mut self, mut other: Program) {
+impl Module {
+    pub fn merge_with(&mut self, mut other: Module) {
         self.func.append(&mut other.func);
         self.globals.append(&mut other.globals)
     }

src/parser/parser.rs (8 changed lines)

@@ -24,6 +24,7 @@ use std::iter::Peekable;
 use std::vec::IntoIter;

 pub struct Parser {
+    pub path: String,
     tokens: Peekable<IntoIter<Token>>,
     peeked: Vec<Token>,
     current: Option<Token>,
@@ -32,12 +33,13 @@ pub struct Parser {
 }

 impl Parser {
-    pub fn new(tokens: Vec<Token>, raw: Option<String>) -> Parser {
+    pub fn new(tokens: Vec<Token>, raw: Option<String>, file_name: String) -> Parser {
         let tokens_without_whitespace: Vec<Token> = tokens
             .into_iter()
             .filter(|token| token.kind != TokenKind::Whitespace && token.kind != TokenKind::Comment)
             .collect();
         Parser {
+            path: file_name,
             tokens: tokens_without_whitespace.into_iter().peekable(),
             peeked: vec![],
             current: None,
@@ -46,8 +48,8 @@ impl Parser {
         }
     }

-    pub fn parse(&mut self) -> Result<Program, String> {
-        let mut program = self.parse_program()?;
+    pub fn parse(&mut self) -> Result<Module, String> {
+        let mut program = self.parse_module()?;

         // infer types
         infer(&mut program);

src/parser/rules.rs (37 changed lines)

@@ -1,3 +1,9 @@
+use super::node_type::Statement;
+use super::node_type::*;
+use super::parser::Parser;
+use crate::lexer::Keyword;
+use crate::lexer::{TokenKind, Value};
+use std::collections::HashMap;
 /**
  * Copyright 2020 Garrit Franke
  *
@@ -13,24 +19,23 @@
  * See the License for the specific language governing permissions and
  * limitations under the License.
  */
-use super::node_type::Statement;
-use super::node_type::*;
-use super::parser::Parser;
-use crate::lexer::Keyword;
-use crate::lexer::{TokenKind, Value};
-use std::collections::HashMap;
+use std::collections::HashSet;
 use std::convert::TryFrom;

 impl Parser {
-    pub fn parse_program(&mut self) -> Result<Program, String> {
+    pub fn parse_module(&mut self) -> Result<Module, String> {
         let mut functions = Vec::new();
         let mut structs = Vec::new();
+        let mut imports = HashSet::new();
         let globals = Vec::new();

         while self.has_more() {
             let next = self.peek()?;
             match next.kind {
                 TokenKind::Keyword(Keyword::Function) => functions.push(self.parse_function()?),
+                TokenKind::Keyword(Keyword::Import) => {
+                    imports.insert(self.parse_import()?);
+                }
                 TokenKind::Keyword(Keyword::Struct) => {
                     structs.push(self.parse_struct_definition()?)
                 }
@@ -38,10 +43,14 @@ impl Parser {
             }
         }

-        Ok(Program {
+        // TODO: Populate imports
+        Ok(Module {
             func: functions,
             structs,
             globals,
+            path: self.path.clone(),
+            imports,
         })
     }
@@ -139,6 +148,18 @@ impl Parser {
         })
     }

+    fn parse_import(&mut self) -> Result<String, String> {
+        self.match_keyword(Keyword::Import)?;
+        let import_path_token = self.match_token(TokenKind::Literal(Value::Str))?;
+
+        // Remove leading and trailing string tokens
+        let mut chars = import_path_token.raw.chars();
+        chars.next();
+        chars.next_back();
+
+        Ok(chars.collect())
+    }
+
     fn parse_type(&mut self) -> Result<Type, String> {
         self.match_token(TokenKind::Colon)?;
         let next = self.peek()?;

src/parser/tests.rs (106 changed lines)

@@ -21,7 +21,7 @@ use crate::parser::parse;
 fn test_parse_empty_function() {
     let raw = "fn main() {}";
     let tokens = tokenize(raw);
-    let tree = parse(tokens, Some(raw.to_string()));
+    let tree = parse(tokens, Some(raw.to_string()), "".into());
     assert!(tree.is_ok())
 }
@@ -33,7 +33,7 @@ fn test_parse_function_with_return() {
     }
     ";
     let tokens = tokenize(raw);
-    let tree = parse(tokens, Some(raw.to_string()));
+    let tree = parse(tokens, Some(raw.to_string()), "".into());
     assert!(tree.is_ok())
 }
@@ -45,7 +45,7 @@ fn test_parse_redundant_semicolon() {
     }
     ";
     let tokens = tokenize(raw);
-    let tree = parse(tokens, Some(raw.to_string()));
+    let tree = parse(tokens, Some(raw.to_string()), "".into());
     assert!(tree.is_err())
 }
@@ -55,7 +55,7 @@ fn test_parse_no_function_context() {
     let x = 1
     ";
     let tokens = tokenize(raw);
-    let tree = parse(tokens, Some(raw.to_string()));
+    let tree = parse(tokens, Some(raw.to_string()), "".into());
     assert!(tree.is_err())
 }
@@ -73,7 +73,7 @@ fn test_parse_multiple_functions() {
     }
     ";
     let tokens = tokenize(raw);
-    let tree = parse(tokens, Some(raw.to_string()));
+    let tree = parse(tokens, Some(raw.to_string()), "".into());
     assert!(tree.is_ok())
 }
@@ -86,7 +86,7 @@ fn test_parse_variable_declaration() {
     }
     ";
     let tokens = tokenize(raw);
-    let tree = parse(tokens, Some(raw.to_string()));
+    let tree = parse(tokens, Some(raw.to_string()), "".into());
     assert!(tree.is_ok())
 }
@@ -100,7 +100,7 @@ fn test_parse_variable_reassignment() {
     }
     ";
     let tokens = tokenize(raw);
-    let tree = parse(tokens, Some(raw.to_string()));
+    let tree = parse(tokens, Some(raw.to_string()), "".into());
     assert!(tree.is_ok())
 }
@@ -114,7 +114,7 @@ fn test_parse_variable_declaration_added() {
     }
     ";
     let tokens = tokenize(raw);
-    let tree = parse(tokens, Some(raw.to_string()));
+    let tree = parse(tokens, Some(raw.to_string()), "".into());
     assert!(tree.is_ok())
 }
@@ -126,7 +126,7 @@ fn test_parse_function_with_args() {
     }
     ";
     let tokens = tokenize(raw);
-    let tree = parse(tokens, Some(raw.to_string()));
+    let tree = parse(tokens, Some(raw.to_string()), "".into());
     assert!(tree.is_ok())
 }
@@ -142,7 +142,7 @@ fn test_parse_function_call() {
     }
     ";
     let tokens = tokenize(raw);
-    let tree = parse(tokens, Some(raw.to_string()));
+    let tree = parse(tokens, Some(raw.to_string()), "".into());
     assert!(tree.is_ok())
 }
@@ -158,7 +158,7 @@ fn test_parse_return_function_call() {
     }
     ";
     let tokens = tokenize(raw);
-    let tree = parse(tokens, Some(raw.to_string()));
+    let tree = parse(tokens, Some(raw.to_string()), "".into());
     assert!(tree.is_ok())
 }
@@ -174,7 +174,7 @@ fn test_parse_function_call_multiple_arguments() {
     }
     ";
     let tokens = tokenize(raw);
-    let tree = parse(tokens, Some(raw.to_string()));
+    let tree = parse(tokens, Some(raw.to_string()), "".into());
     assert!(tree.is_ok())
 }
@@ -190,7 +190,7 @@ fn test_parse_nexted_function_call() {
     }
     ";
     let tokens = tokenize(raw);
-    let tree = parse(tokens, Some(raw.to_string()));
+    let tree = parse(tokens, Some(raw.to_string()), "".into());
     assert!(tree.is_ok())
 }
@@ -202,7 +202,7 @@ fn test_parse_basic_ops() {
     }
     ";
     let tokens = tokenize(raw);
-    let tree = parse(tokens, Some(raw.to_string()));
+    let tree = parse(tokens, Some(raw.to_string()), "".into());
     assert!(tree.is_ok())
 }
@@ -214,7 +214,7 @@ fn test_parse_compound_ops() {
     }
     ";
     let tokens = tokenize(raw);
-    let tree = parse(tokens, Some(raw.to_string()));
+    let tree = parse(tokens, Some(raw.to_string()), "".into());
     assert!(tree.is_ok())
 }
@@ -226,7 +226,7 @@ fn test_parse_compound_ops_with_function_call() {
     }
     ";
     let tokens = tokenize(raw);
-    let tree = parse(tokens, Some(raw.to_string()));
+    let tree = parse(tokens, Some(raw.to_string()), "".into());
     assert!(tree.is_ok())
 }
@@ -238,7 +238,7 @@ fn test_parse_compound_ops_with_strings() {
     }
     ";
     let tokens = tokenize(raw);
-    let tree = parse(tokens, Some(raw.to_string()));
+    let tree = parse(tokens, Some(raw.to_string()), "".into());
     assert!(tree.is_ok())
 }
@@ -250,7 +250,7 @@ fn test_parse_compound_ops_with_identifier() {
     }
     ";
     let tokens = tokenize(raw);
-    let tree = parse(tokens, Some(raw.to_string()));
+    let tree = parse(tokens, Some(raw.to_string()), "".into());
     assert!(tree.is_ok())
 }
@@ -262,7 +262,7 @@ fn test_parse_compound_ops_with_identifier_first() {
     }
     ";
     let tokens = tokenize(raw);
-    let tree = parse(tokens, Some(raw.to_string()));
+    let tree = parse(tokens, Some(raw.to_string()), "".into());
     assert!(tree.is_ok())
 }
@@ -274,7 +274,7 @@ fn test_parse_compound_ops_return() {
     }
     ";
     let tokens = tokenize(raw);
-    let tree = parse(tokens, Some(raw.to_string()));
+    let tree = parse(tokens, Some(raw.to_string()), "".into());
     assert!(tree.is_ok())
 }
@@ -288,7 +288,7 @@ fn test_parse_basic_conditional() {
     }
     ";
     let tokens = tokenize(raw);
-    let tree = parse(tokens, Some(raw.to_string()));
+    let tree = parse(tokens, Some(raw.to_string()), "".into());
     assert!(tree.is_ok())
 }
@@ -303,7 +303,7 @@ fn test_parse_basic_conditional_with_multiple_statements() {
     }
     ";
     let tokens = tokenize(raw);
-    let tree = parse(tokens, Some(raw.to_string()));
+    let tree = parse(tokens, Some(raw.to_string()), "".into());
     assert!(tree.is_ok())
 }
@@ -320,7 +320,7 @@ fn test_parse_conditional_else_if_branch() {
     }
     ";
     let tokens = tokenize(raw);
-    let tree = parse(tokens, Some(raw.to_string()));
+    let tree = parse(tokens, Some(raw.to_string()), "".into());
     assert!(tree.is_ok())
 }
@@ -339,7 +339,7 @@ fn test_parse_conditional_multiple_else_if_branch_branches() {
     }
     ";
     let tokens = tokenize(raw);
-    let tree = parse(tokens, Some(raw.to_string()));
+    let tree = parse(tokens, Some(raw.to_string()), "".into());
     assert!(tree.is_ok())
 }
@@ -356,7 +356,7 @@ fn test_parse_conditional_else_branch() {
     }
     ";
     let tokens = tokenize(raw);
-    let tree = parse(tokens, Some(raw.to_string()));
+    let tree = parse(tokens, Some(raw.to_string()), "".into());
     assert!(tree.is_ok())
 }
@@ -377,7 +377,7 @@ fn test_parse_conditional_elseif_else_branch() {
     }
     ";
     let tokens = tokenize(raw);
-    let tree = parse(tokens, Some(raw.to_string()));
+    let tree = parse(tokens, Some(raw.to_string()), "".into());
     assert!(tree.is_ok())
 }
@@ -390,7 +390,7 @@ fn test_int_array() {
     }
     ";
     let tokens = tokenize(raw);
-    let tree = parse(tokens, Some(raw.to_string()));
+    let tree = parse(tokens, Some(raw.to_string()), "".into());
     assert!(tree.is_ok())
 }
@@ -402,7 +402,7 @@ fn test_string_array() {
     }
     ";
     let tokens = tokenize(raw);
-    let tree = parse(tokens, Some(raw.to_string()));
+    let tree = parse(tokens, Some(raw.to_string()), "".into());
     assert!(tree.is_ok())
 }
@@ -417,7 +417,7 @@ fn test_basic_while_loop() {
     }
     ";
     let tokens = tokenize(raw);
-    let tree = parse(tokens, Some(raw.to_string()));
+    let tree = parse(tokens, Some(raw.to_string()), "".into());
     assert!(tree.is_ok())
 }
@@ -432,7 +432,7 @@ fn test_while_loop_boolean_expression() {
     }
     ";
     let tokens = tokenize(raw);
-    let tree = parse(tokens, Some(raw.to_string()));
+    let tree = parse(tokens, Some(raw.to_string()), "".into());
     assert!(tree.is_ok())
 }
@@ -449,7 +449,7 @@ fn test_boolean_arithmetic() {
     }
     ";
     let tokens = tokenize(raw);
-    let tree = parse(tokens, Some(raw.to_string()));
+    let tree = parse(tokens, Some(raw.to_string()), "".into());
     assert!(tree.is_ok())
 }
@@ -468,7 +468,7 @@ fn test_array_access_in_loop() {
     }
     ";
     let tokens = tokenize(raw);
-    let tree = parse(tokens, Some(raw.to_string()));
+    let tree = parse(tokens, Some(raw.to_string()), "".into());
     assert!(tree.is_ok())
 }
@@ -482,7 +482,7 @@ fn test_array_access_standalone() {
     }
     ";
     let tokens = tokenize(raw);
-    let tree = parse(tokens, Some(raw.to_string()));
+    let tree = parse(tokens, Some(raw.to_string()), "".into());
     assert!(tree.is_ok())
 }
@@ -497,7 +497,7 @@ fn test_array_access_assignment() {
     }
     ";
     let tokens = tokenize(raw);
-    let tree = parse(tokens, Some(raw.to_string()));
+    let tree = parse(tokens, Some(raw.to_string()), "".into());
     assert!(tree.is_ok())
 }
@@ -513,7 +513,7 @@ fn test_array_access_in_if() {
     }
     ";
     let tokens = tokenize(raw);
-    let tree = parse(tokens, Some(raw.to_string()));
+    let tree = parse(tokens, Some(raw.to_string()), "".into());
     assert!(tree.is_ok())
 }
@@ -526,7 +526,7 @@ fn test_uninitialized_variables() {
     }
     ";
     let tokens = tokenize(raw);
-    let tree = parse(tokens, Some(raw.to_string()));
+    let tree = parse(tokens, Some(raw.to_string()), "".into());
     assert!(tree.is_ok())
 }
@@ -538,7 +538,7 @@ fn test_function_call_math() {
     }
     ";
     let tokens = tokenize(raw);
-    let tree = parse(tokens, Some(raw.to_string()));
+    let tree = parse(tokens, Some(raw.to_string()), "".into());
     assert!(tree.is_ok())
 }
@@ -550,7 +550,7 @@ fn test_function_multiple_args() {
     }
     ";
     let tokens = tokenize(raw);
-    let tree = parse(tokens, Some(raw.to_string()));
+    let tree = parse(tokens, Some(raw.to_string()), "".into());
     assert!(tree.is_ok())
 }
@@ -562,7 +562,7 @@ fn test_array_position_assignment() {
     }
     ";
     let tokens = tokenize(raw);
-    let tree = parse(tokens, Some(raw.to_string()));
+    let tree = parse(tokens, Some(raw.to_string()), "".into());
     assert!(tree.is_ok())
 }
@@ -576,7 +576,7 @@ fn test_typed_declare() {
     }
     ";
     let tokens = tokenize(raw);
-    let tree = parse(tokens, Some(raw.to_string()));
+    let tree = parse(tokens, Some(raw.to_string()), "".into());
     assert!(tree.is_ok())
 }
@@ -588,7 +588,7 @@ fn test_no_function_args_without_type() {
} }
"; ";
let tokens = tokenize(raw); let tokens = tokenize(raw);
let tree = parse(tokens, Some(raw.to_string())); let tree = parse(tokens, Some(raw.to_string()), "".into());
assert!(tree.is_err()) assert!(tree.is_err())
} }
@ -600,7 +600,7 @@ fn test_function_with_return_type() {
} }
"; ";
let tokens = tokenize(raw); let tokens = tokenize(raw);
let tree = parse(tokens, Some(raw.to_string())); let tree = parse(tokens, Some(raw.to_string()), "".into());
assert!(tree.is_ok()); assert!(tree.is_ok());
assert_eq!(tree.unwrap().func[0].ret_type, Some(Type::Int)); assert_eq!(tree.unwrap().func[0].ret_type, Some(Type::Int));
} }
@ -617,7 +617,7 @@ fn test_booleans_in_function_call() {
} }
"; ";
let tokens = tokenize(raw); let tokens = tokenize(raw);
let tree = parse(tokens, Some(raw.to_string())); let tree = parse(tokens, Some(raw.to_string()), "".into());
assert!(tree.is_ok()); assert!(tree.is_ok());
} }
@ -637,7 +637,7 @@ fn test_late_initializing_variable() {
} }
"; ";
let tokens = tokenize(raw); let tokens = tokenize(raw);
let tree = parse(tokens, Some(raw.to_string())); let tree = parse(tokens, Some(raw.to_string()), "".into());
assert!(tree.is_ok()); assert!(tree.is_ok());
} }
@ -653,7 +653,7 @@ fn test_simple_for_loop() {
} }
"; ";
let tokens = tokenize(raw); let tokens = tokenize(raw);
let tree = parse(tokens, Some(raw.to_string())); let tree = parse(tokens, Some(raw.to_string()), "".into());
assert!(tree.is_ok()); assert!(tree.is_ok());
} }
@ -671,7 +671,7 @@ fn test_nested_for_loop() {
} }
"; ";
let tokens = tokenize(raw); let tokens = tokenize(raw);
let tree = parse(tokens, Some(raw.to_string())); let tree = parse(tokens, Some(raw.to_string()), "".into());
assert!(tree.is_ok()); assert!(tree.is_ok());
} }
@ -689,7 +689,7 @@ fn test_nested_array() {
} }
"; ";
let tokens = tokenize(raw); let tokens = tokenize(raw);
let tree = parse(tokens, Some(raw.to_string())); let tree = parse(tokens, Some(raw.to_string()), "".into());
assert!(tree.is_ok()); assert!(tree.is_ok());
} }
@ -702,7 +702,7 @@ fn test_simple_nested_expression() {
} }
"; ";
let tokens = tokenize(raw); let tokens = tokenize(raw);
let tree = parse(tokens, Some(raw.to_string())); let tree = parse(tokens, Some(raw.to_string()), "".into());
assert!(tree.is_ok()); assert!(tree.is_ok());
} }
@ -722,7 +722,7 @@ fn test_continue() {
} }
"; ";
let tokens = tokenize(raw); let tokens = tokenize(raw);
let tree = parse(tokens, Some(raw.to_string())); let tree = parse(tokens, Some(raw.to_string()), "".into());
assert!(tree.is_ok()); assert!(tree.is_ok());
} }
@ -742,7 +742,7 @@ fn test_break() {
} }
"; ";
let tokens = tokenize(raw); let tokens = tokenize(raw);
let tree = parse(tokens, Some(raw.to_string())); let tree = parse(tokens, Some(raw.to_string()), "".into());
assert!(tree.is_ok()); assert!(tree.is_ok());
} }
@ -761,7 +761,7 @@ fn test_complex_nested_expressions() {
} }
"; ";
let tokens = tokenize(raw); let tokens = tokenize(raw);
let tree = parse(tokens, Some(raw.to_string())); let tree = parse(tokens, Some(raw.to_string()), "".into());
assert!(tree.is_ok()); assert!(tree.is_ok());
} }
@ -773,7 +773,7 @@ fn test_array_as_argument() {
} }
"; ";
let tokens = tokenize(raw); let tokens = tokenize(raw);
let tree = parse(tokens, Some(raw.to_string())); let tree = parse(tokens, Some(raw.to_string()), "".into());
assert!(tree.is_ok()); assert!(tree.is_ok());
} }
@ -795,6 +795,6 @@ fn test_struct_initialization() {
} }
"; ";
let tokens = tokenize(raw); let tokens = tokenize(raw);
let tree = parse(tokens, Some(raw.to_string())); let tree = parse(tokens, Some(raw.to_string()), "".into());
assert!(tree.is_ok()); assert!(tree.is_ok());
} }

5
src/tests/test_examples.rs

@@ -36,6 +36,11 @@ fn test_directory(dir_in: &str) -> Result<(), Error> {
     for ex in examples {
         let example = ex?;
         let in_file = dir.join(dir_in).join(example.file_name());
+        // We don't want to build submodules, since they don't run without a main function
+        if in_file.is_dir() {
+            continue;
+        }
         let out_file = dir.join(&dir_out).join(
             example
                 .file_name()
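The guard added above keeps the example runner from feeding module directories (which have no `main` function) to the compiler. The same filtering step can be sketched in isolation; the helper name and paths below are illustrative, not the project's actual API:

```rust
use std::fs;
use std::io;
use std::path::PathBuf;

/// Collect only regular files from `dir`, skipping subdirectories.
/// Module directories contain no `main` function, so they cannot be
/// built and run as standalone test programs.
fn collect_buildable_files(dir: &str) -> io::Result<Vec<PathBuf>> {
    let mut files = Vec::new();
    for entry in fs::read_dir(dir)? {
        let path = entry?.path();
        if path.is_dir() {
            // Submodule directory: leave it to the importing file.
            continue;
        }
        files.push(path);
    }
    Ok(files)
}

fn main() {
    // Hypothetical layout: one buildable file next to one module directory.
    let tmp = std::env::temp_dir().join("sb_collect_demo");
    fs::create_dir_all(tmp.join("submodule")).unwrap();
    fs::write(tmp.join("main.sb"), "fn main() {}").unwrap();
    let files = collect_buildable_files(tmp.to_str().unwrap()).unwrap();
    assert_eq!(files.len(), 1);
    assert!(files[0].ends_with("main.sb"));
}
```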

5
tests/importable_module/foo/bar.sb

@@ -0,0 +1,5 @@
+import "baz"
+
+fn nested_module() {
+    println("A deeply nested function was called!")
+}

3
tests/importable_module/foo/baz/module.sb

@@ -0,0 +1,3 @@
+fn baz() {
+    println("Baz was called")
+}

6
tests/importable_module/module.sb

@@ -0,0 +1,6 @@
+import "foo/bar"
+
+fn external_function() {
+    println("I was called!!")
+    nested_module()
+}

5
tests/imports.sb

@@ -0,0 +1,5 @@
+import "importable_module"
+
+fn main() {
+    external_function()
+}