diff --git a/docs/mappings/abap.md b/docs/mappings/abap.md
index 5c82b0c2..02db3d1c 100644
--- a/docs/mappings/abap.md
+++ b/docs/mappings/abap.md
@@ -1,5 +1,5 @@
---
-sidebar_position: 5
+sidebar_position: 2
description: "ABAP Type System"
---
diff --git a/docs/mappings/apache-spark.md b/docs/mappings/apache-spark.md
index 35888836..d65b2580 100644
--- a/docs/mappings/apache-spark.md
+++ b/docs/mappings/apache-spark.md
@@ -7,22 +7,24 @@ description: "CSN Interop types to Apache Spark types."
> DRAFT This mapping definition is work in progress and may be subject to further change.
- Spark Data Types coming from here: [Link](https://spark.apache.org/docs/latest/api/python/reference/pyspark.sql/data_types.html)
-- DataSphere Data Types coming from here: [Link](https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/7b1dc6e0fad147de8e50aa8dc4744aa3.html?locale=en-US)
-|CDS | Spark / Delta Lake | Datasphere | Comment | Spark format |
-|--- |------------------- |----------- |-------- |--------------|
-|`cds.Boolean`| BOOLEAN | `cds.Boolean`| | |
-|`cds.String` (length ) | STRING | `cds.String` | Datasphere Logic: IF `cds.String(length = undefined)` THEN `cds.String(length = 5000)` | |
-|`cds.LargeString` (length ) | STRING | `cds.LargeString` | | |
-|`cds.Integer`| INT | `cds.Integer` | | |
-|`cds.Integer64`| BIGINT | `cds.Integer64` | | |
-|`cds.Decimal` (precision = p, scale = s)| DECIMAL(p,s) | `cds.Decimal` | Datasphere Logic: IF `cds.Decimal(p < 17)` THEN `cds.Decimal(p = 17)` | |
-|`cds.Decimal` (precision = p, scale = floating) | ***not supported*** | `cds.Decimal` | Decimal with scale = floating is not supported in spark | |
-|Amounts with Currencies `cds.Decimal` (precision = 34, scale = 4) | `cds.Decimal(34, 4)` | `cds.Decimal(34, 4)` | Since spark does not support `cds.DecimalFloat` we use cds.Decimal(34,4) as compromise for now | |
-|`cds.Double`| DOUBLE | `cds.Double` | | |
-|`cds.Date`| DATE | `cds.Date` | | "yyyyMMdd" |
-|`cds.Time` must be expressed as `cds.String(6)` or `cds.String(12)` depending on the source representation for now + the annotation `@Semantics.time: true`| STRING | `cds.String(6)` or `cds.String(12)` | Data is in format `HHmmss` or `HH:mm:ss.SSS` - consumer must use the function to_time() to convert to `cds.Time`| |
-|`cds.DateTime` sec precision| TIMESTAMP | `cds.Timestamp` | | |
-|`cds.Timestamp` µs precision| TIMESTAMP | `cds.Timestamp` | | "yyyy-MM-dd'T'HH:mm:ss.SSSSSSS" |
-|`cds.UUID` + the annotation `@Semantics.uuid: true`| STRING (36) | `cds.UUID` | | | |
+|CDS | Spark / Delta Lake | Comment | Spark format |
+|--- |------------------- |-------- |--------------|
+|`cds.Boolean` | BOOLEAN | | |
+|`cds.String` (length) | STRING | | |
+|`cds.LargeString` (length) | STRING | | |
+|`cds.Integer` | INT | | |
+|`cds.Integer64` | BIGINT | | |
+|`cds.Decimal` (precision = p, scale = s) | DECIMAL(p,s) | | |
+|`cds.Decimal` (precision = p, scale = floating) | ***not supported*** | Decimal with scale = floating is not supported in Spark | |
+|Amounts with Currencies `cds.Decimal` (precision = 34, scale = 4) | `cds.Decimal(34, 4)` | Since Spark does not support `cds.DecimalFloat`, we use `cds.Decimal(34, 4)` as a compromise for now | |
+|`cds.Decimal` (no arguments) | ***not supported*** | | |
+|`cds.Double` | DOUBLE | | |
+|`cds.Date` | DATE | | "yyyyMMdd" |
+|`cds.Time` must be expressed as `cds.String(6)` or `cds.String(12)` for now, depending on the source representation, plus the annotation `@Semantics.time: true` | STRING | Data is in the format `HHmmss` or `HH:mm:ss.SSS`; the consumer must use the function to_time() to convert it to `cds.Time` | |
+|`cds.DateTime` sec precision | TIMESTAMP | | |
+|`cds.Timestamp` µs precision | TIMESTAMP | | "yyyy-MM-dd'T'HH:mm:ss.SSSSSSS" |
+|`cds.UUID` + the annotation `@Semantics.uuid: true` | STRING (36) | | |
+|`hana.ST_GEOMETRY` (in DSP, not in CDS) | STRING | CSN with type info | |
+|`hana.ST_POINT` | STRING | CSN with type info | |
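The string-encoded `cds.Time` representation described in the table above can be decoded on the consumer side. A minimal sketch in plain Python (the helper name `parse_cds_time` is illustrative and not part of any Spark or CSN API):

```python
from datetime import datetime, time

def parse_cds_time(value: str) -> time:
    """Decode a time serialized as HHmmss (cds.String(6))
    or HH:mm:ss.SSS (cds.String(12)) into a time object."""
    if len(value) == 6:        # e.g. "134500"
        return datetime.strptime(value, "%H%M%S").time()
    if len(value) == 12:       # e.g. "13:45:00.123"
        return datetime.strptime(value, "%H:%M:%S.%f").time()
    raise ValueError(f"unexpected cds.Time representation: {value!r}")
```

In a Spark pipeline the same logic would typically be applied per row (e.g. via a UDF) or with the engine's own timestamp-parsing functions; the sketch only shows the two wire formats the table defines.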
diff --git a/docs/mappings/datasphere.md b/docs/mappings/datasphere.md
new file mode 100755
index 00000000..c014d47d
--- /dev/null
+++ b/docs/mappings/datasphere.md
@@ -0,0 +1,30 @@
+---
+sidebar_position: 4
+title: SAP Datasphere
+description: "CSN Interop types to SAP Datasphere types."
+---
+
+> DRAFT This mapping definition is work in progress and may be subject to further change.
+
+- Datasphere data types are taken from here: [Link](https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/2f39104e5bd847919b8daee1580c4f68.html)
+
+
+|CDS | Datasphere | Comment |
+|--- |----------- |-------- |
+|`cds.Boolean`| `cds.Boolean`| |
+|`cds.String` (length) | `cds.String` | Datasphere Logic: IF `cds.String(length = undefined)` THEN `cds.String(length = 5000)` |
+|`cds.LargeString` (length) | `cds.LargeString` | |
+|`cds.Integer`| `cds.Integer` | |
+|`cds.Integer64`| `cds.Integer64` | |
+|`cds.Decimal` (precision = p, scale = s) | `cds.Decimal` | Datasphere Logic: IF `cds.Decimal(p < 17)` THEN `cds.Decimal(p = 17)` |
+|`cds.Decimal` (precision = p, scale = floating) | `cds.Decimal` | |
+|Amounts with Currencies `cds.Decimal` (precision = 34, scale = 4) | `cds.Decimal(34, 4)` | |
+|`cds.Decimal` (no arguments) | DecimalFloat | |
+|`cds.Double` | `cds.Double` | |
+|`cds.Date` | `cds.Date` | |
+|`cds.Time` must be expressed as `cds.String(6)` or `cds.String(12)` for now, depending on the source representation, plus the annotation `@Semantics.time: true` | `cds.String(6)` or `cds.String(12)` | |
+|`cds.DateTime` sec precision | `cds.Timestamp` | |
+|`cds.Timestamp` µs precision | `cds.Timestamp` | |
+|`cds.UUID` + the annotation `@Semantics.uuid: true` | `cds.UUID` | |
+|- | hana.ST_GEOMETRY | |
+|- | hana.ST_POINT | |
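The Datasphere defaulting rules stated in the table (an undefined `cds.String` length becomes 5000; a `cds.Decimal` precision below 17 is raised to 17) can be sketched as small normalization helpers. The function names and shape are illustrative assumptions, not Datasphere code:

```python
def normalize_string_length(length=None):
    """Datasphere logic: cds.String(length = undefined) -> cds.String(length = 5000)."""
    return 5000 if length is None else length

def normalize_decimal_precision(precision):
    """Datasphere logic: cds.Decimal(p < 17) -> cds.Decimal(p = 17)."""
    return max(precision, 17)
```

For example, a `cds.String` with no declared length normalizes to length 5000, while a `cds.Decimal(10, 2)` is widened to precision 17.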
diff --git a/docs/mappings/hana.md b/docs/mappings/hana.md
new file mode 100755
index 00000000..b78096a9
--- /dev/null
+++ b/docs/mappings/hana.md
@@ -0,0 +1,32 @@
+---
+sidebar_position: 3
+title: SAP HANA
+description: "CSN Interop types to SAP HANA types."
+---
+
+> DRAFT This mapping definition is work in progress and may be subject to further change.
+
+- HANA data types are taken from here: [Link](https://help.sap.com/docs/SAP_HANA_PLATFORM/4fe29514fd584807ac9f2a04f6754767/20a1569875191014b507cf392724b7eb.html?locale=en-US) and, in the context of CAP / OData, here: [Link](https://cap.cloud.sap/docs/advanced/hana#hana-types)
+
+
+|CDS | HANA | Comment |
+|--- |----- |-------- |
+|`cds.Boolean`| BOOLEAN | |
+|`cds.String` (length) | NVARCHAR(length) | |
+|`cds.LargeString` (length) | NCLOB | |
+|`cds.Integer`| INTEGER | |
+|`cds.Integer64`| BIGINT | |
+|`cds.Decimal` (precision = p, scale = s) | DECIMAL(p,s) | |
+|`cds.Decimal` (precision = p, scale = floating) | DECIMAL | |
+|Amounts with Currencies `cds.Decimal` (precision = 34, scale = 4) | DECIMAL(34,4) | |
+|`cds.Decimal` (no arguments) | DECIMAL | |
+|`cds.Double` | DOUBLE | |
+|`cds.Date` | DATE | |
+|`cds.Time` must be expressed as `cds.String(6)` or `cds.String(12)` for now, depending on the source representation, plus the annotation `@Semantics.time: true` | TIME | Data is in the format `HHmmss` or `HH:mm:ss.SSS`; the consumer must use the function to_time() to convert it to `cds.Time` |
+|`cds.DateTime` sec precision | TIMESTAMP | |
+|`cds.Timestamp` µs precision | TIMESTAMP | HANA TIMESTAMP has higher (sub-µs) precision, so conversion from HANA incurs precision loss |
+|`cds.UUID` + the annotation `@Semantics.uuid: true` | NVARCHAR(36) | |
+|- | ST_GEOMETRY | |
+|- | ST_POINT | |
+|`cds.Vector` | REAL_VECTOR | |
+|- | HALF_VECTOR | HANA half-precision vector type |
diff --git a/docs/mappings/index.mdx b/docs/mappings/index.mdx
old mode 100644
new mode 100755