New interop structure, types and precision information #114
dritter-sap wants to merge 12 commits into SAP:main from
Conversation
Thank you for your contribution, it is greatly appreciated! May I suggest that when introducing new types for a new source, you start with a new document (see the ABAP example): first introduce the types of the source system, map them to CDS types, and optionally provide their Spark equivalents with the respective cast instructions. Then, for any CDS types, we can introduce a mapping into a target type system. The Spark type document needs rework to align with this strategy; we can work on that in a separate PR. Thank you!
@dritter-sap: Thanks for bringing this up. I think we need to discuss it in the CSN Interop WS; we already did so to some degree. First, as @jalberti mentioned, the types would first need to be introduced into CSN Interop Types itself. But here I see a problem: for types like these, in most cases it would be safer to work with annotations that could guide how a database maps them to its own types. This is also how JSON Schema works: there are only a few real technical types that are completely stable; the rest is described with "format" and via constraints like "maxValue". FYI, @andreasbalzar
docs/mappings/apache-spark.md
| hana.ST_GEOMETRY | STRING | `cds.String` | CSN with type info | | | |
| hana.ST_POINT | STRING | `cds.String` | CSN with type info | | | |
We can't introduce hana.* types into the CSN Interop specification because they are by definition not interoperable: they are HANA-DB specific.
That has been reworked, kindly check.
| | `cds.Integer` | INT | | | | |
| | `cds.Integer64` | BIGINT | | | | |
| | `cds.Decimal` (precision = p, scale = s) | DECIMAL(p,s) | | | | |
| | `cds.Decimal` (precision = p, scale = floating) | ***not supported*** | Decimal with scale = floating is not supported in Spark | | | |
Our proposal is to support this by mapping `cds.Decimal` (precision = p, scale = floating) to STRING in Delta. Do you want to add this in the current PR or a later one?
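To illustrate the idea behind this proposal (a hedged sketch in plain Python; the helper names are illustrative, not from the spec): a floating-scale decimal has no fixed scale, so Spark's fixed DECIMAL(p,s) cannot hold it losslessly, whereas serializing to a STRING column preserves the exact digits and lets consumers parse the value back:

```python
from decimal import Decimal

def floating_decimal_to_string(value: Decimal) -> str:
    # str() keeps the exact digits and scale of the Decimal value
    return str(value)

def string_to_floating_decimal(text: str) -> Decimal:
    # consumers parse the STRING column back into an exact decimal
    return Decimal(text)

v = Decimal("12345.678901234567890123456789012345")  # scale too large for a fixed DECIMAL(p,s)
s = floating_decimal_to_string(v)
assert string_to_floating_decimal(s) == v  # lossless round-trip
```

Python's `Decimal` constructor and `str()` are exact (context precision only applies to arithmetic), so the round-trip is lossless regardless of scale.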
| | `cds.Decimal` (precision = p, scale = s) | DECIMAL(p,s) | | | | |
| | `cds.Decimal` (precision = p, scale = floating) | ***not supported*** | Decimal with scale = floating is not supported in Spark | | | |
| | Amounts with currencies: `cds.Decimal` (precision = 34, scale = 4) | DECIMAL(34,4) | Since Spark does not support `cds.DecimalFloat`, we use `cds.Decimal(34,4)` as a compromise for now | | | |
| | `cds.Decimal` (no arguments) | ***not supported*** | | | | |
Same for `cds.Decimal` (no arguments).
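As a hedged sketch of the DECIMAL(34, 4) compromise for currency amounts (the helper name is hypothetical, not from the spec): Python's `decimal` module can emulate the fixed scale of 4 via `quantize`:

```python
from decimal import Decimal, ROUND_HALF_EVEN

def to_amount(value: str) -> Decimal:
    # fix the scale to 4 fractional digits, as in DECIMAL(34, 4);
    # banker's rounding (half-even) mirrors common monetary defaults
    return Decimal(value).quantize(Decimal("0.0001"), rounding=ROUND_HALF_EVEN)

assert str(to_amount("19.99")) == "19.9900"
assert str(to_amount("2.50005")) == "2.5000"  # half-even rounds to the even digit
```

Values whose integer part exceeds 30 digits would overflow DECIMAL(34, 4); that is the trade-off the "compromise" note above accepts.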
| | `cds.Timestamp` (µs precision) | TIMESTAMP | | "yyyy-MM-dd'T'HH:mm:ss.SSSSSSS" | | |
| | `cds.UUID` + the annotation `@Semantics.uuid: true` | STRING (36) | | | | |
| | hana.ST_GEOMETRY (in DSP, not in CDS) | STRING | CSN with type info | | | |
| | hana.ST_POINT | STRING | CSN with type info | | | |
New proposal: map `cds.Vector` to STRING in Delta.
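As a quick sanity check of the `cds.UUID` → STRING (36) row above (plain Python, nothing spec-specific): the canonical 8-4-4-4-12 hex form of a UUID is always 36 characters, so STRING (36) fits exactly:

```python
import uuid

u = uuid.uuid4()
s = str(u)                 # canonical lowercase 8-4-4-4-12 representation
assert len(s) == 36        # fits STRING (36)
assert uuid.UUID(s) == u   # parses back to the same UUID
```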
| | `cds.Integer` | INT | | | | |
| | `cds.Integer64` | BIGINT | | | | |
| | `cds.Decimal` (precision = p, scale = s) | DECIMAL(p,s) | | | | |
| | `cds.Decimal` (precision = p, scale = floating) | ***not supported*** | Decimal with scale = floating is not supported in Spark | | | |
The HANA type SMALLDECIMAL seems to be missing from the list. But I think it may be fine not to include it for now, as there is no CDS type associated with it, so CAP consumers/producers of DP cannot consume or produce such a type anyway.
We reworked the current interop documents, as agreed, into three parts (splitting Spark and DSP) and added a type mapping for HANA.