diff --git a/docs/SPEC.md b/docs/SPEC.md
index e97f55bbd6..ad9fd4101a 100644
--- a/docs/SPEC.md
+++ b/docs/SPEC.md
@@ -96,7 +96,7 @@ The following keywords are reserved and may not be used as identifiers:
     and import not return option test
     empty in or package builtin
 
-[IMPL#256](https://github.com/influxdata/platform/issues/256) Add in and empty operator support 
+[IMPL#256](https://github.com/influxdata/platform/issues/256) Add in and empty operator support
 
 #### Operators
@@ -666,7 +666,7 @@ Duration and Time types are Timeable.
 ##### Stringable Constraint
 
 Stringable types can be evaluated and expressed in string interpolation.
-String, Int, Uint, Float, Bool, Time, and Duration types are Stringable. 
+String, Int, Uint, Float, Bool, Time, and Duration types are Stringable.
 
 ### Blocks
@@ -764,7 +764,7 @@ A primary expressions may be a literal, an identifier denoting a variable, or a
 #### Logical Operators
 
-Flux provides the logical operators `and` and `or`. 
+Flux provides the logical operators `and` and `or`.
 Flux's logical operators observe the short-circuiting behavior seen in other programming languages,
 meaning that the right-hand side (RHS) operand is conditionally evaluated
 depending on the result of evaluating the left-hand side (LHS) operand.
 
 When the operator is `and`:
@@ -1147,7 +1147,8 @@ All such values must have a corresponding builtin statement to declare the exist
     Function   = "(" [Parameters] ")" "=>" MonoType .
     Properties = Property { "," Property } .
-    Property   = identifier ":" MonoType .
+    Property   = Label ":" MonoType .
+    Label      = identifier | string_lit
     Parameters = Parameter { "," Parameter } .
    Parameter  = [ "<-" | "?" ] identifier ":" MonoType .
@@ -1291,8 +1292,8 @@ All calls to `systemTime` within a single evaluation of a Flux script return the
 
 `today()` returns the `now()` timestamp truncated to the day unit.
 Example:
- 
-    option now = 2021-01-01T12:43:21Z 
+
+    option now = 2021-01-01T12:43:21Z
     today() // Returns 2021-01-01T00:00:00Z
 
 ### FixedZone
@@ -1510,7 +1511,7 @@ Example:
 #### Buckets
 
-Buckets is a type of data source that retrieves a list of buckets that the caller is authorized to access. 
+Buckets is a type of data source that retrieves a list of buckets that the caller is authorized to access.
 It takes no input parameters and produces an output table with the following columns:
 
 | Name | Type | Description |
@@ -1546,9 +1547,9 @@ Example:
 #### Fill
 
-Fill will scan a stream for null values and replace them with a non-null value. 
+Fill will scan a stream for null values and replace them with a non-null value.
 
-The output stream will be the same as the input stream, with all null values in the column replaced. 
+The output stream will be the same as the input stream, with all null values in the column replaced.
 
 Fill has the following properties:
@@ -2501,9 +2502,9 @@ Duplicate duplicates a specified column in a table.
 If the specified column is not present in a table an error will be thrown.
 If the specified column is part of the group key, it will be duplicated, but it will not be part of the group key of the output table.
 If the column indicated by `as` does not exist, a column will be added to the table.
-If the column does exist, that column will be overwritten with the values specified by `column`. 
+If the column does exist, that column will be overwritten with the values specified by `column`.
 If the `as` column is in the group key, there are two possible outcomes:
-If the column indicated by `column` is in the group key, then `as` will remain in the group key and have the same group key value as `column`. 
+If the column indicated by `column` is in the group key, then `as` will remain in the group key and have the same group key value as `column`.
 If `column` is not part of the group key, then `as` is removed from the group key.
 Duplicate has the following properties:
@@ -2619,10 +2620,10 @@ Or:
     |> group(columns: ["host", "_measurement"], mode: "by")
 ```
-Records are grouped by the `"host"` and `"_measurement"` columns. 
+Records are grouped by the `"host"` and `"_measurement"` columns.
 The resulting group key is `["host", "_measurement"]`, so a new table for every different `["host", "_measurement"]`
-value is created. 
-Every table in the result contains every record for some `["host", "_measurement"]` value. 
+value is created.
+Every table in the result contains every record for some `["host", "_measurement"]` value.
 Every record in some resulting table has the same value for the columns `"host"` and `"_measurement"`.
 
 _Except_
@@ -2633,7 +2634,7 @@ from(bucket: "telegraf/autogen")
     |> group(columns: ["_time"], mode: "except")
 ```
-Records are grouped by the set of all columns in the table, excluding `"_time"`. 
+Records are grouped by the set of all columns in the table, excluding `"_time"`.
 For example, if the table has columns `["_time", "host", "_measurement", "_field", "_value"]`
 then the group key would be `["host", "_measurement", "_field", "_value"]`.
@@ -2645,7 +2646,7 @@ from(bucket: "telegraf/autogen")
     |> group()
 ```
-Records are grouped into a single table. 
+Records are grouped into a single table.
 The group key of the resulting table is empty.
 
 #### Columns
@@ -2814,7 +2815,7 @@ window(intervals: intervals(every:1d, period:8h, offset:9h)) // window the data
 #### Pivot
 
-Pivot collects values stored vertically (column-wise) in a table and aligns them horizontally (row-wise) into logical sets. 
+Pivot collects values stored vertically (column-wise) in a table and aligns them horizontally (row-wise) into logical sets.
 Pivot has the following properties:
@@ -2875,7 +2876,7 @@ Output:
 | 1970-01-01T00:00:00.000000003Z | "m1" | null | null | null | 7.0  |
 | 1970-01-01T00:00:00.000000004Z | "m1" | null | null | 8.0  | null |
 
-Example 2, align fields and measurements that have the same timestamp. 
+Example 2, align fields and measurements that have the same timestamp.
 Note the effect of:
 - having null values in some `columnKey` value;
 - having more values for the same `rowKey` and `columnKey` value (the 11th row overrides the 10th, and so does the 15th with the 14th).
@@ -2933,8 +2934,8 @@ Both `tables` and `on` are required parameters.
 The `on` parameter and the `cross` method are mutually exclusive.
 Join currently only supports two input streams.
 
-[IMPL#83](https://github.com/influxdata/flux/issues/83) Add support for joining more than 2 streams 
-[IMPL#84](https://github.com/influxdata/flux/issues/84) Add support for different join types 
+[IMPL#83](https://github.com/influxdata/flux/issues/83) Add support for joining more than 2 streams
+[IMPL#84](https://github.com/influxdata/flux/issues/84) Add support for different join types
 
 Example:
@@ -3131,7 +3132,7 @@ from(bucket: "telegraf/autogen")
 #### Difference
 
-Difference computes the difference between subsequent records. 
+Difference computes the difference between subsequent records.
 Every user-specified column of numeric type is subtracted while others are kept intact.
 
 Difference has the following properties:
@@ -3254,7 +3255,7 @@ Moving Average has the following properties:
 | Name | Type | Description |
 | ---- | ---- | -----------
-| n | int | N specifies the number of points to mean. 
+| n | int | N specifies the number of points to mean.
 
 Rules for taking the moving average for numeric types:
 - the average over a period populated by `n` values is equal to their algebraic mean
@@ -3301,8 +3302,8 @@ Timed Moving Average has the following properties:
 | Name | Type | Description |
 | ---- | ---- | -----------
 | every | duration | Every specifies the frequency of windows.
-| period | duration | Period specifies the window size to mean. 
-| column | string | Column specifies a column to aggregate. Defaults to `"_value"` 
+| period | duration | Period specifies the window size to mean.
+| column | string | Column specifies a column to aggregate. Defaults to `"_value"`
 
 Example:
 ```
@@ -3319,7 +3320,7 @@ It is a weighted moving average that gives more weighting to recent data as oppo
 | Name | Type | Description |
 | ---- | ---- | -----------
-| n | int | N specifies the number of points to mean. 
+| n | int | N specifies the number of points to mean.
 
 Rules for taking the exponential moving average for numeric types:
 - the first value of an exponential moving average over `n` values is the algebraic mean of the first `n` values
@@ -3510,7 +3511,7 @@ It acts on the `_value` column.
 Name | Type | Description |
 ---- | ---- | -----------
-| n | int | N specifies the sample size of the algorithm. 
+| n | int | N specifies the sample size of the algorithm.
 
 A triple exponential derivative is defined as `TRIX[i] = ((EMA3[i] / EMA3[i - 1]) - 1) * 100`, where
 - `EMA_3` is `EMA(EMA(EMA(data)))`
@@ -3569,7 +3570,7 @@ over a specified time period to measure speed and change of data movements. It a
 Name | Type | Description |
 ---- | ---- | -----------
-| n | int | N specifies the sample size of the algorithm. 
+| n | int | N specifies the sample size of the algorithm.
 
 Rules for calculating the relative strength index for numeric types:
 - The general process of calculating is `RSI = 100 - (100 / (1 + (AVG GAIN / AVG LOSS)))`
@@ -4031,7 +4032,7 @@ Example:
 #### Contains
 
-Tests whether a value is a member of a set. 
+Tests whether a value is a member of a set.
 
 Contains has the following parameters:
@@ -4046,7 +4047,7 @@ Example:
 #### Stream/table functions
 
 These functions allow to extract a table from a stream of tables (`tableFind`) and access its
-columns (`getColumn`) and records (`getRecord`). 
+columns (`getColumn`) and records (`getRecord`).
 The example below provides an overview of these functions, further information can be found in the paragraphs below.
 
 ```
diff --git a/libflux/flux-core/src/ast/mod.rs b/libflux/flux-core/src/ast/mod.rs
index 963c2229a3..7cd4b52a3a 100644
--- a/libflux/flux-core/src/ast/mod.rs
+++ b/libflux/flux-core/src/ast/mod.rs
@@ -323,6 +323,18 @@ pub enum PropertyKey {
     StringLit(StringLit),
 }
 
+impl From<Identifier> for PropertyKey {
+    fn from(id: Identifier) -> Self {
+        Self::Identifier(id)
+    }
+}
+
+impl From<StringLit> for PropertyKey {
+    fn from(lit: StringLit) -> Self {
+        Self::StringLit(lit)
+    }
+}
+
 impl PropertyKey {
     /// Returns the [`BaseNode`] for a [`PropertyKey`].
     pub fn base(&self) -> &BaseNode {
@@ -762,7 +774,7 @@ pub struct PropertyType {
     #[serde(default)]
     #[serde(flatten)]
     pub base: BaseNode,
-    pub name: Identifier,
+    pub name: PropertyKey,
     pub monotype: MonoType,
 }
 
diff --git a/libflux/flux-core/src/ast/tests.rs b/libflux/flux-core/src/ast/tests.rs
index fccd7a05b8..8ff4128d42 100644
--- a/libflux/flux-core/src/ast/tests.rs
+++ b/libflux/flux-core/src/ast/tests.rs
@@ -586,7 +586,8 @@ fn test_json_record() {
                 name: Identifier {
                     base: BaseNode::default(),
                     name: "A".to_string(),
-                },
+                }
+                .into(),
                 monotype: MonoType::Basic(NamedType {
                     base: BaseNode::default(),
                     name: Identifier {
@@ -597,10 +598,10 @@ fn test_json_record() {
         }],
     });
     let serialized = serde_json::to_string(&n).unwrap();
-    assert_eq!(
-        serialized,
-        r#"{"type":"RecordType","tvar":{"name":"A"},"properties":[{"name":{"name":"A"},"monotype":{"type":"NamedType","name":{"name":"int"}}}]}"#
-    );
+    expect_test::expect![[
+        r#"{"type":"RecordType","tvar":{"name":"A"},"properties":[{"name":{"type":"Identifier","name":"A"},"monotype":{"type":"NamedType","name":{"name":"int"}}}]}"#
+    ]]
+    .assert_eq(&serialized);
     let deserialized: MonoType = serde_json::from_str(serialized.as_str()).unwrap();
     assert_eq!(deserialized, n)
 }
@@ -626,7 +627,8 @@ fn test_json_record_no_tvar() {
                 name: Identifier {
                     base: BaseNode::default(),
                     name: "A".to_string(),
-                },
+                }
+                .into(),
                 monotype: MonoType::Basic(NamedType {
                     base: BaseNode::default(),
                     name: Identifier {
@@ -637,10 +639,9 @@ fn test_json_record_no_tvar() {
         }],
     });
     let serialized = serde_json::to_string(&n).unwrap();
-    assert_eq!(
-        serialized,
-        r#"{"type":"RecordType","properties":[{"name":{"name":"A"},"monotype":{"type":"NamedType","name":{"name":"int"}}}]}"#
-    );
+    expect_test::expect![[
+        r#"{"type":"RecordType","properties":[{"name":{"type":"Identifier","name":"A"},"monotype":{"type":"NamedType","name":{"name":"int"}}}]}"#
+    ]].assert_eq(&serialized);
     let deserialized: MonoType = serde_json::from_str(serialized.as_str()).unwrap();
     assert_eq!(deserialized, n)
 }
diff --git a/libflux/flux-core/src/ast/walk/mod.rs b/libflux/flux-core/src/ast/walk/mod.rs
index 25a524ab04..6d8daf201c 100644
--- a/libflux/flux-core/src/ast/walk/mod.rs
+++ b/libflux/flux-core/src/ast/walk/mod.rs
@@ -468,7 +468,7 @@ where
             }
         },
         Node::PropertyType(n) => {
-            walk(v, Node::Identifier(&n.name));
+            walk(v, Node::from_property_key(&n.name));
             walk(v, Node::MonoType(&n.monotype));
         }
         Node::ParameterType(n) => match n {
diff --git a/libflux/flux-core/src/formatter/mod.rs b/libflux/flux-core/src/formatter/mod.rs
index 8f20a792cc..0d11c330eb 100644
--- a/libflux/flux-core/src/formatter/mod.rs
+++ b/libflux/flux-core/src/formatter/mod.rs
@@ -449,7 +449,7 @@ impl<'doc> Formatter<'doc> {
             n.properties.iter().map(|p| {
                 docs![
                     arena,
-                    self.format_identifier(&p.name),
+                    self.format_property_key(&p.name),
                     ": ",
                     self.format_monotype(&p.monotype),
                 ]
diff --git a/libflux/flux-core/src/parser/mod.rs b/libflux/flux-core/src/parser/mod.rs
index 49a86d0a06..f5088ae6fb 100644
--- a/libflux/flux-core/src/parser/mod.rs
+++ b/libflux/flux-core/src/parser/mod.rs
@@ -104,17 +104,49 @@ impl<'input> Parser<'input> {
     // expect will check if the next token is `exp` and error if it is not in either case the token
     // is consumed and returned
     fn expect(&mut self, exp: TokenType) -> Token {
+        self.expect_one_of(&[exp])
+    }
+
+    fn expect_one_of(&mut self, exp: &[TokenType]) -> Token {
+        fn one_of(expected_tokens: &[TokenType]) -> String {
+            match expected_tokens.len() {
+                0 => "".to_string(),
+                1 => expected_tokens[0].to_string(),
+                _ => {
+                    use std::fmt::Write;
+
+                    let mut buf = String::new();
+
+                    for (i, exp) in expected_tokens.iter().enumerate() {
+                        let s = match i {
+                            0 => "",
+                            _ if i < expected_tokens.len() - 1 => ",",
+                            // Last expected message to be written
+                            _ => " or",
+                        };
+                        write!(buf, "{} `{}`", s, exp).unwrap();
+                    }
+
+                    buf
+                }
+            }
+        }
+
         let t = self.scan();
         match t.tok {
-            tok if tok == exp => (),
+            tok if exp.contains(&tok) => (),
             TokenType::Eof => {
-                self.errs.push(format!("expected {}, got EOF", exp));
+                self.errs.push(format!("expected {}, got EOF", one_of(exp)));
             }
             _ => {
                 let pos = ast::Position::from(&t.start_pos);
                 self.errs.push(format!(
                     "expected {}, got {} ({}) at {}:{}",
-                    exp, t.tok, t.lit, pos.line, pos.column,
+                    one_of(exp),
+                    t.tok,
+                    t.lit,
+                    pos.line,
+                    pos.column,
                 ));
             }
         }
@@ -690,19 +722,27 @@ impl<'input> Parser<'input> {
         let t = self.peek();
         let properties = match t.tok {
-            TokenType::Ident => {
-                let identifier = self.parse_identifier();
-                let t = self.peek();
-                match t.tok {
-                    TokenType::Colon => self.parse_property_type_list_suffix(identifier),
-                    TokenType::Ident if t.lit == "with" => {
-                        id = Some(identifier);
-                        self.expect(TokenType::Ident);
-                        self.parse_property_type_list()
+            TokenType::Ident | TokenType::String => {
+                let property_key = self.parse_property_key();
+
+                match property_key {
+                    PropertyKey::Identifier(identifier) => {
+                        let t = self.peek();
+                        match t.tok {
+                            TokenType::Colon => self.parse_property_type_list_suffix(
+                                PropertyKey::Identifier(identifier),
+                            ),
+                            TokenType::Ident if t.lit == "with" => {
+                                id = Some(identifier);
+                                self.expect(TokenType::Ident);
+                                self.parse_property_type_list()
+                            }
+                            // This is an error, but the token is not consumed so the error gets
+                            // caught below with self.close(TokenType::RBrace)
+                            _ => vec![],
+                        }
                     }
-                    // This is an error, but the token is not consumed so the error gets
-                    // caught below with self.close(TokenType::RBrace)
-                    _ => vec![],
+                    PropertyKey::StringLit(_) => self.parse_property_type_list_suffix(property_key),
                 }
             }
             // The record is empty
@@ -718,10 +758,10 @@ impl<'input> Parser<'input> {
         })
     }
     fn parse_property_type_list(&mut self) -> Vec<PropertyType> {
-        let id = self.parse_identifier();
+        let id = self.parse_property_key();
         self.parse_property_type_list_suffix(id)
     }
-    fn parse_property_type_list_suffix(&mut self, id: Identifier) -> Vec<PropertyType> {
+    fn parse_property_type_list_suffix(&mut self, id: PropertyKey) -> Vec<PropertyType> {
         let mut properties = Vec::<PropertyType>::with_capacity(5);
         let p = self.parse_property_type_suffix(id);
         properties.push(p);
@@ -738,15 +778,15 @@ impl<'input> Parser<'input> {
         properties
     }
     fn parse_property_type(&mut self) -> PropertyType {
-        let identifier = self.parse_identifier(); // identifier
-        self.parse_property_type_suffix(identifier)
+        let key = self.parse_property_key();
+        self.parse_property_type_suffix(key)
     }
-    fn parse_property_type_suffix(&mut self, id: Identifier) -> PropertyType {
+    fn parse_property_type_suffix(&mut self, name: PropertyKey) -> PropertyType {
         self.expect(TokenType::Colon); // :
         let monotype = self.parse_monotype();
         PropertyType {
-            base: self.base_node_from_others(&id.base, monotype.base()),
-            name: id,
+            base: self.base_node_from_others(name.base(), monotype.base()),
+            name,
             monotype,
         }
     }
@@ -1485,6 +1525,17 @@ impl<'input> Parser<'input> {
             }
         }
     }
+    fn parse_property_key(&mut self) -> PropertyKey {
+        let t = self.expect_one_of(&[TokenType::Ident, TokenType::String]);
+        match t.tok {
+            TokenType::Ident => PropertyKey::Identifier(Identifier {
+                base: self.base_node_from_token(&t),
+                name: t.lit,
+            }),
+            TokenType::String => PropertyKey::StringLit(self.new_string_literal(t)),
+            _ => unreachable!(),
+        }
+    }
     fn parse_identifier(&mut self) -> Identifier {
         let t = self.expect(TokenType::Ident);
         Identifier {
@@ -1538,6 +1589,9 @@ impl<'input> Parser<'input> {
     }
     fn parse_string_literal(&mut self) -> StringLit {
         let t = self.expect(TokenType::String);
+        self.new_string_literal(t)
+    }
+    fn new_string_literal(&mut self, t: Token) -> StringLit {
         match strconv::parse_string(t.lit.as_str()) {
             Ok(value) => StringLit {
                 base: self.base_node_from_token(&t),
diff --git a/libflux/flux-core/src/parser/tests/types.rs b/libflux/flux-core/src/parser/tests/types.rs
index afe46a9a6c..95f4cf94af 100644
--- a/libflux/flux-core/src/parser/tests/types.rs
+++ b/libflux/flux-core/src/parser/tests/types.rs
@@ -567,7 +567,8 @@ fn test_parse_record_type_only_properties() {
                     location: loc.get(1, 2, 1, 3),
                     ..BaseNode::default()
                 },
-            },
+            }
+            .into(),
             monotype: MonoType::Basic(NamedType {
                 base: BaseNode {
                     location: loc.get(1, 4, 1, 7),
@@ -593,7 +594,8 @@ fn test_parse_record_type_only_properties() {
                     location: loc.get(1, 9, 1, 10),
                     ..BaseNode::default()
                 },
-            },
+            }
+            .into(),
             monotype: MonoType::Basic(NamedType {
                 base: BaseNode {
                     location: loc.get(1, 11, 1, 15),
@@ -613,6 +615,212 @@
     )
 }
 
+#[test]
+fn test_parse_record_type_string_literal_property() {
+    let mut p = Parser::new(r#"{"a":int, b:uint}"#);
+    let parsed = p.parse_record_type();
+    expect_test::expect![[r#"
+        Record(
+            RecordType {
+                base: BaseNode {
+                    location: SourceLocation {
+                        file: None,
+                        start: Position {
+                            line: 1,
+                            column: 1,
+                        },
+                        end: Position {
+                            line: 1,
+                            column: 18,
+                        },
+                        source: Some(
+                            "{\"a\":int, b:uint}",
+                        ),
+                    },
+                    comments: [],
+                    errors: [],
+                },
+                tvar: None,
+                properties: [
+                    PropertyType {
+                        base: BaseNode {
+                            location: SourceLocation {
+                                file: None,
+                                start: Position {
+                                    line: 1,
+                                    column: 2,
+                                },
+                                end: Position {
+                                    line: 1,
+                                    column: 9,
+                                },
+                                source: Some(
+                                    "\"a\":int",
+                                ),
+                            },
+                            comments: [],
+                            errors: [],
+                        },
+                        name: StringLit(
+                            StringLit {
+                                base: BaseNode {
+                                    location: SourceLocation {
+                                        file: None,
+                                        start: Position {
+                                            line: 1,
+                                            column: 2,
+                                        },
+                                        end: Position {
+                                            line: 1,
+                                            column: 5,
+                                        },
+                                        source: Some(
+                                            "\"a\"",
+                                        ),
+                                    },
+                                    comments: [],
+                                    errors: [],
+                                },
+                                value: "a",
+                            },
+                        ),
+                        monotype: Basic(
+                            NamedType {
+                                base: BaseNode {
+                                    location: SourceLocation {
+                                        file: None,
+                                        start: Position {
+                                            line: 1,
+                                            column: 6,
+                                        },
+                                        end: Position {
+                                            line: 1,
+                                            column: 9,
+                                        },
+                                        source: Some(
+                                            "int",
+                                        ),
+                                    },
+                                    comments: [],
+                                    errors: [],
+                                },
+                                name: Identifier {
+                                    base: BaseNode {
+                                        location: SourceLocation {
+                                            file: None,
+                                            start: Position {
+                                                line: 1,
+                                                column: 6,
+                                            },
+                                            end: Position {
+                                                line: 1,
+                                                column: 9,
+                                            },
+                                            source: Some(
+                                                "int",
+                                            ),
+                                        },
+                                        comments: [],
+                                        errors: [],
+                                    },
+                                    name: "int",
+                                },
+                            },
+                        ),
+                    },
+                    PropertyType {
+                        base: BaseNode {
+                            location: SourceLocation {
+                                file: None,
+                                start: Position {
+                                    line: 1,
+                                    column: 11,
+                                },
+                                end: Position {
+                                    line: 1,
+                                    column: 17,
+                                },
+                                source: Some(
+                                    "b:uint",
+                                ),
+                            },
+                            comments: [],
+                            errors: [],
+                        },
+                        name: Identifier(
+                            Identifier {
+                                base: BaseNode {
+                                    location: SourceLocation {
+                                        file: None,
+                                        start: Position {
+                                            line: 1,
+                                            column: 11,
+                                        },
+                                        end: Position {
+                                            line: 1,
+                                            column: 12,
+                                        },
+                                        source: Some(
+                                            "b",
+                                        ),
+                                    },
+                                    comments: [],
+                                    errors: [],
+                                },
+                                name: "b",
+                            },
+                        ),
+                        monotype: Basic(
+                            NamedType {
+                                base: BaseNode {
+                                    location: SourceLocation {
+                                        file: None,
+                                        start: Position {
+                                            line: 1,
+                                            column: 13,
+                                        },
+                                        end: Position {
+                                            line: 1,
+                                            column: 17,
+                                        },
+                                        source: Some(
+                                            "uint",
+                                        ),
+                                    },
+                                    comments: [],
+                                    errors: [],
+                                },
+                                name: Identifier {
+                                    base: BaseNode {
+                                        location: SourceLocation {
+                                            file: None,
+                                            start: Position {
+                                                line: 1,
+                                                column: 13,
+                                            },
+                                            end: Position {
+                                                line: 1,
+                                                column: 17,
+                                            },
+                                            source: Some(
+                                                "uint",
+                                            ),
+                                        },
+                                        comments: [],
+                                        errors: [],
+                                    },
+                                    name: "uint",
+                                },
+                            },
+                        ),
+                    },
+                ],
+            },
+        )
+    "#]]
+    .assert_debug_eq(&parsed);
+}
+
 #[test]
 fn test_parse_record_type_trailing_comma() {
     let mut p = Parser::new(r#"{a:int,}"#);
@@ -637,7 +845,8 @@ fn test_parse_record_type_trailing_comma() {
                     location: loc.get(1, 2, 1, 3),
                     ..BaseNode::default()
                 },
-            },
+            }
+            .into(),
             monotype: MonoType::Basic(NamedType {
                 base: BaseNode {
                     location: loc.get(1, 4, 1, 7),
@@ -1215,8 +1424,10 @@ fn test_parse_record_type_tvar_properties() {
                 base: BaseNode {
                     location: loc.get(1, 9, 1, 10),
                     ..BaseNode::default()
-                },
-            },
+                }
+                .into(),
+            }
+            .into(),
             monotype: MonoType::Basic(NamedType {
                 base: BaseNode {
                     location: loc.get(1, 11, 1, 14),
@@ -1242,7 +1453,8 @@ fn test_parse_record_type_tvar_properties() {
                     location: loc.get(1, 16, 1, 17),
                     ..BaseNode::default()
                 },
-            },
+            }
+            .into(),
             monotype: MonoType::Basic(NamedType {
                 base: BaseNode {
                     location: loc.get(1, 18, 1, 22),
diff --git a/libflux/flux-core/src/semantic/convert.rs b/libflux/flux-core/src/semantic/convert.rs
index 7e1e85c77f..57fe37d31c 100644
--- a/libflux/flux-core/src/semantic/convert.rs
+++ b/libflux/flux-core/src/semantic/convert.rs
@@ -578,8 +578,12 @@ impl<'a> Converter<'a> {
             }
         };
         for prop in &rec.properties {
+            let name = match &prop.name {
+                ast::PropertyKey::Identifier(id) => &id.name,
+                ast::PropertyKey::StringLit(lit) => &lit.value,
+            };
             let property = types::Property {
-                k: types::Label::from(self.symbols.lookup(&prop.name.name)),
+                k: types::Label::from(self.symbols.lookup(name)),
                 v: self.convert_monotype(&prop.monotype, tvars),
             };
             r = MonoType::from(types::Record::Extension {
diff --git a/libflux/flux-core/src/semantic/tests.rs b/libflux/flux-core/src/semantic/tests.rs
index 478397538d..f5bd96a2a6 100644
--- a/libflux/flux-core/src/semantic/tests.rs
+++ b/libflux/flux-core/src/semantic/tests.rs
@@ -2817,6 +2817,21 @@ fn record_with_scoped_labels() {
     }
 }
 
+#[test]
+fn record_with_literal_fields() {
+    test_infer! {
+        env: map![
+            "r" => r##"{ "with spaces": int, "#$%": string }"##,
+        ],
+        src: r##"
+            o = {x: r["with spaces"], y: r["#$%"]}
+        "##,
+        exp: map![
+            "o" => "{x: int , y: string}",
+        ],
+    }
+}
+
 #[test]
 fn pseudo_complete_query() {
     // TODO(algow): re-introduce equality constraints for binary comparison operators
diff --git a/libflux/go/libflux/buildinfo.gen.go b/libflux/go/libflux/buildinfo.gen.go
index abf33712b2..a10abaf585 100644
--- a/libflux/go/libflux/buildinfo.gen.go
+++ b/libflux/go/libflux/buildinfo.gen.go
@@ -16,18 +16,18 @@ var sourceHashes = map[string]string{
 	"libflux/Cargo.toml": "91ac4e8b467440c6e8a9438011de0e7b78c2732403bb067d4dd31539ac8a90c1",
 	"libflux/flux-core/Cargo.toml": "4a23fdfcf31b3f1f91208808e036bfdf66c2ac6b3724c61cdb3f17b05d2f9bb1",
 	"libflux/flux-core/src/ast/check/mod.rs": "47e06631f249715a44c9c8fa897faf142ad0fa26f67f8cfd5cd201e82cb1afc8",
-	"libflux/flux-core/src/ast/mod.rs": "5e95066833895d8b3359a8f84f2cf025e1cf50483678e6f6c5dfc9101d1d5c3a",
-	"libflux/flux-core/src/ast/walk/mod.rs": "a49efe67d796351d29af7b4d1ad5b400b21dd0ddfff5f0d7af6d915294d3edcb",
+	"libflux/flux-core/src/ast/mod.rs": "87555cc82f6adb1606c3c6dea4ef91a3b0763b6a64eb2dfbc682bb67f4957d9c",
+	"libflux/flux-core/src/ast/walk/mod.rs": "aa75319f33938c43db4b907249bc67508f00fba02e45ae175e939e96e178298b",
 	"libflux/flux-core/src/bin/README.md": "c1245a4938c923d065647b4dc4f7e19486e85c93d868ef2f7f47ddff62ec81df",
 	"libflux/flux-core/src/bin/fluxc.rs": "bf275289e690236988049fc0a07cf832dbac25bb5739c02135b069dcdfab4d0f",
 	"libflux/flux-core/src/bin/fluxdoc.rs": "bad4b12bcf4a8bc1a94cb37cda004bf7fb593abf3f0c6c3a2af6fabc60337c5d",
 	"libflux/flux-core/src/doc/example.rs": "6414756b3c74df1b58fdb739592e74ded3f89d85d647809333f72a3f6aad146f",
 	"libflux/flux-core/src/doc/mod.rs": "e8aae31bc4a60836d7258a03f6827b76e9fad44c889db6a21d6679c26818f2d2",
 	"libflux/flux-core/src/errors.rs": "5ee16ec2fd281f7c115ba0b7bcf3380749f64b7a282eb0fab7e6afe9858f3d4d",
-	"libflux/flux-core/src/formatter/mod.rs": "783339cfa1cca754a5469754fad4afebd74b8ebe3164e2ee76a0674e6fec569d",
+	"libflux/flux-core/src/formatter/mod.rs": "2b37befce1624318d8eafd1547881d342daa3cb66bf12967689006ffd2eafaf3",
 	"libflux/flux-core/src/lib.rs": "443aed16dd600eecc1ffbee2d2dead6788e796cd5a278eb5dafb123843b8959e",
 	"libflux/flux-core/src/map.rs": "342c1cc111d343f01b97f38be10a9f1097bdd57cdc56f55e92fd3ed5028e6973",
-	"libflux/flux-core/src/parser/mod.rs": "a2861ad9b5e5ed430c0927d7fd975e79f13b128a1e0863cb9d82059cdd38384a",
+	"libflux/flux-core/src/parser/mod.rs": "12d3d61a78cfa466d9955db126b6da15f890a6010c8b7034522a88bca7fd4e6c",
 	"libflux/flux-core/src/parser/strconv.rs": "84d24110f8af4a40ff7584cb0a41e8a1f8949d19c1ec329719057c5302f7615d",
 	"libflux/flux-core/src/scanner/mod.rs": "297809a7b5778363a490bc4e08add05005bdc50d7c8cd59f27390f6316aba69e",
 	"libflux/flux-core/src/scanner/scanner.rl": "e3755aed899244461e8b2a05a87ab41a89fe3d66d28f60c25ad9895f26675ba8",
@@ -37,7 +37,7 @@ var sourceHashes = map[string]string{
 	"libflux/flux-core/src/scanner/unicode.rl.COPYING": "6cf2d5d26d52772ded8a5f0813f49f83dfa76006c5f398713be3854fe7bc4c7e",
 	"libflux/flux-core/src/semantic/bootstrap.rs": "cc82dbe982672fc4fe34f4268e78aae1e0e1869f214233f537e070a9e34b7737",
 	"libflux/flux-core/src/semantic/check.rs": "d0228a0a8176a5360d88cfe48acb1ffd036817b6aaadfadb94af446492603305",
-	"libflux/flux-core/src/semantic/convert.rs": "e6d4d01d887434b73dc2663eeb88294b67a64d15ffed7fb61af0e401fb5670ea",
+	"libflux/flux-core/src/semantic/convert.rs": "9e85a90f9f64878b8e77077b65d52409365cf82b62c09949b808cbaec16031ee",
 	"libflux/flux-core/src/semantic/env.rs": "db424704eece030a76dee968c3d1959e94d21b7f7400c35f8e233cb3089218a4",
 	"libflux/flux-core/src/semantic/flatbuffers/mod.rs": "a259b3c3b80cb6c19a38877df27d999f82e286629c132fd8abed5f95da5ee95f",
 	"libflux/flux-core/src/semantic/flatbuffers/semantic_generated.rs": "0f54e652b45b1515c71160a555200a062f9c3c5a681a14f7a0ac36cc19dd4e6f",