Commit
Added more infos and pictures to README.md
More information about the sub-libraries and their maintainers has been added to the README. The file has also been restructured to be more readable and better categorized.

Signed-off-by: Hadi Adineh <122263902+hadi-adineh-ascs@users.noreply.github.com>
adineh authored Sep 12, 2023
1 parent 0140fdf commit 98215a5
Showing 1 changed file with 209 additions and 33 deletions.
@@ -22,97 +22,273 @@ Learn more about the ENVITED research cluster of the Automotive Solution Center

We look forward to welcoming you as a member of our community!

# OpenMSL Sub-Libraries

+ [SL1 - Perception Sensor Models](#sl1---perception-sensor-models)
+ [SL2 - Traffic Participant Models](#sl2---traffic-participant-models)
+ [SL3 - Scenario Data](#sl3---scenario-data)
+ [SL4 - Static Environment Data](#sl4---static-environment-data)
+ [SL5 - Tooling](#sl5---tooling)

<br>

---
<br>

## SL1 - Perception Sensor Models

This sub-library is a collection of [OSI](https://github.com/OpenSimulationInterface/open-simulation-interface) compliant sensor models according to the [OSMP](https://github.com/OpenSimulationInterface/osi-sensor-model-packaging) specification including a template repository
demonstrating the [Credible Simulation Process](https://setlevel.de/assets/forschungsergebnisse/Credible-Simulation-Process-v1.0.pdf) by running full scale [SSP](https://ssp-standard.org/) based co-simulations in the CI pipeline.
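The stepping pattern behind such co-simulations can be sketched with a toy, dependency-free master loop; the class and method names below are hypothetical stand-ins, not FMI or OSMP API (a real master calls each FMU's `fmi2DoStep` and exchanges serialized OSI messages):

```python
class ModelStub:
    """Stand-in for an OSMP FMU; hypothetical, for illustration only."""

    def __init__(self, name):
        self.name = name
        self.inbox = b""  # last serialized message received

    def do_step(self, t):
        # A real OSMP FMU would parse an OSI message from its input
        # and serialize its output; we just emit a tagged payload.
        return f"{self.name}@{t:.1f}".encode()

def run_cosimulation(models, t_end, dt):
    """Fixed-step master loop: step all models, then cross-wire outputs."""
    t = 0.0
    while t < t_end:
        outputs = [m.do_step(t) for m in models]
        for m, out in zip(models, reversed(outputs)):
            m.inbox = out
        t += dt
    return [m.inbox for m in models]
```

In the actual sub-library this loop is played by an SSP-capable co-simulation master running inside the CI pipeline.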

Initiated: 2022-07-25

### Maintainer

- [![avatar](https://images.weserv.nl/?url=avatars.githubusercontent.com/u/78017112?v=4&h=50&w=50&fit=cover&mask=circle&maxage=7d) Lukas Elster](https://github.com/LukasElster) (FZD TU Darmstadt)

- [![avatar](https://images.weserv.nl/?url=avatars.githubusercontent.com/u/27010086?v=4&h=50&w=50&fit=cover&mask=circle&maxage=7d) Clemens Linnhoff](https://github.com/ClemensLinnhoff) (Persival GmbH)

- [![avatar](https://images.weserv.nl/?url=avatars.githubusercontent.com/u/122266565?v=4&h=50&w=50&fit=cover&mask=circle&maxage=7d) Jürgen Wille](https://github.com/FM-juergenW) (FrontMod GmbH)

### Repositories

- #### [sl-1-0-sensor-model-repository-template](https://github.com/openMSL/sl-1-0-sensor-model-repository-template)

Enter a short description of the model.
What is the purpose of the model?
What is the general modeling approach?
What inputs does the model need and what outputs does it generate?

< Eye-catcher Image >

<img src="https://github.com/openMSL/sl-1-0-sensor-model-repository-template/raw/main/doc/img/model_video.gif" width="800" />

more info: [click here](https://github.com/openMSL/sl-1-0-sensor-model-repository-template)
<br>

- #### [sl-1-1-reflection-based-radar-object-model](https://github.com/openMSL/sl-1-1-reflection-based-radar-object-model)

<img align="right" src="https://gitlab.com/tuda-fzd/perception-sensor-modeling/object-based-generic-perception-object-model/uploads/17c84e9ec0acf0fac2e35855f038ad0b/fzdlogo.jpg" width="100" />

This model is a Reflection Based Radar Model based on the [Modular OSMP Framework](https://gitlab.com/tuda-fzd/perception-sensor-modeling/modular-osmp-framework) by FZD.
It is a highly parameterizable sensor system model including detection calculation and object tracking simulation.
The model receives radar reflection data calculated beforehand in a simulation tool, e.g. with ray tracing.
The model outputs are radar detections and detected moving objects.

<img src="https://github.com/openMSL/sl-1-1-reflection-based-radar-object-model/raw/main/doc/img/model_video.gif" width="800" />

more info: [click here](https://github.com/openMSL/sl-1-1-reflection-based-radar-object-model)
<br>

- #### [sl-1-2-reflection-based-lidar-object-model](https://github.com/openMSL/sl-1-2-reflection-based-lidar-object-model)

The current version of the model is built on the enhancements to the Open Simulation Interface from the publicly funded SETLevel project.
It is therefore dependent on the non-standard [SL OSI](https://gitlab.setlevel.de/open/osi) and not [ASAM OSI](https://github.com/OpenSimulationInterface/open-simulation-interface).

<img align="right" src="https://github.com/openMSL/sl-1-2-reflection-based-lidar-object-model/raw/main/doc/img/fzd_logo.jpg" width="100" />

This is the FZD Reflection Based Lidar Model based on the FZD OSI Sensor Model Packaging Framework.
It is a highly parameterizable sensor system model including detection calculation and object tracking simulation.
The model receives lidar reflections calculated beforehand in a simulation tool, e.g. with ray tracing.
The model outputs are lidar detections and detected moving objects.<br>

<img src="https://github.com/openMSL/sl-1-2-reflection-based-lidar-object-model/raw/main/doc/img/model_video.gif" width="800" />

more info: [click here](https://github.com/openMSL/sl-1-2-reflection-based-lidar-object-model)
<br>

- #### [sl-1-3-object-based-generic-perception-object-model](https://github.com/openMSL/sl-1-3-object-based-generic-perception-object-model)

<img align="right" src="https://gitlab.com/tuda-fzd/perception-sensor-modeling/object-based-generic-perception-object-model/uploads/17c84e9ec0acf0fac2e35855f038ad0b/fzdlogo.jpg" width="100" />

This model is a highly parameterizable generic perception sensor and tracking model.
It can be parameterized as a Lidar or a Radar.
The model is based on object lists and all modeling is performed on object level.
It includes typical sensor artifacts like soft FoV transitions, different detection ranges for different targets,
occlusion effects depending on the sensor technology as well as simulation of tracking behavior.
The model outputs are object lists for OSI SensorData moving objects.

The architecture of the model as well as the parameterization structure are designed to be as generic as possible
to fit both radar and lidar sensors, utilizing similarities in signal propagation and signal processing between the two technologies.
This way, the model can be parameterized to model different kinds of lidar and radar sensors.
To give an example: You can set an irradiation pattern for the modeled sensor.
Depending on the sensor technology this can either be an antenna gain pattern for radar or a beam pattern for lidar.

<img src="https://github.com/openMSL/sl-1-3-object-based-generic-perception-object-model/raw/main/doc/img/model_video.gif" width="800" />

more info: [click here](https://github.com/openMSL/sl-1-3-object-based-generic-perception-object-model)
<br>
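To give a rough idea of such a pattern-based parameterization, here is a small illustrative sketch; the cosine pattern and all function names are assumptions for illustration, not the model's actual parameters:

```python
import math

def cosine_pattern_db(azimuth_rad):
    """Illustrative two-way (transmit x receive) gain pattern in dB."""
    return 20.0 * math.log10(max(math.cos(azimuth_rad), 1e-6))

def detection_range(max_range_m, two_way_gain_db):
    """Scale the boresight detection range by the angular gain.

    For a radar, the maximum range scales with the fourth root of the
    linear two-way gain (radar equation), i.e. 10 ** (gain_db / 40).
    """
    return max_range_m * 10 ** (two_way_gain_db / 40.0)

# A target on the boresight keeps the full range; an off-axis target
# loses range gradually, giving a "soft FoV transition" effect.
boresight = detection_range(150.0, cosine_pattern_db(0.0))
off_axis = detection_range(150.0, cosine_pattern_db(math.radians(60.0)))
```

Swapping in an antenna gain pattern (radar) or a beam pattern (lidar) changes only the pattern function, which is the point of the generic design.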

- #### [sl-1-4-object-based-camera-object-model](https://github.com/openMSL/sl-1-4-object-based-camera-object-model)

This model is a parameterizable object-based video perception sensor and tracking model using the OSI interface.
The model was developed by Bosch in the SETLevel project. It simulates basic video-typical effects in a phenomenological way.
The "object based camera object model" is based on object lists and all modeling is performed on object level. The model outputs are object lists for OSI SensorData moving and stationary objects.

The outer layer of the model is the OSI Sensor Model Packaging (OSMP).
It specifies ways in which models using the Open Simulation Interface (OSI) are to be packaged for their use in simulation environments using FMI 2.0.
For more detailed information see the official documentation.

<img src="https://github.com/openMSL/sl-1-4-object-based-camera-object-model/raw/main/doc/img/OSMPCameraSensor_Demo.gif" width="800" />

more info: [click here](https://github.com/openMSL/sl-1-4-object-based-camera-object-model)
<br>

- #### [sl-1-5-sensor-model-testing](https://github.com/openMSL/sl-1-5-sensor-model-testing)

This repository contains FMUs for automated testing, verification and validation for open-source perception sensor models.

- [Lukas Elster](https://github.com/LukasElster) (FZD TU Darmstadt)
- [Clemens Linnhoff](https://github.com/ClemensLinnhoff) (Persival GmbH)
- [Jürgen Wille](https://github.com/FM-juergenW) (FrontMod GmbH)
Further information can be found in [the repository](https://github.com/openMSL/sl-1-5-sensor-model-testing), the READMEs of the respective folders, and the documentation below:
- FMU collection for sensor model testing:
- [OSI Check FMU](src/osi-field-checker/)<br>
This FMU checks if fields are missing in a received SensorData.
- [SRMD Validator](src/srmd-validator/)<br>
This Python script looks for SRMD files and validates them.<br>

<br>

---
<br>

## SL2 - Traffic Participant Models

This sub-library is a set of OSI compliant traffic participant models, which include pedestrian models, SSP based ALKS systems, automated road users and others to demonstrate closed loop simulations in combination with other sub-libraries utilizing open-source simulators such as [esmini](https://github.com/esmini/esmini).

Initiated: Call for participation. Get engaged [hello@envited.market](mailto:hello@envited.market)

### Maintainer

- TBD
- TBD
- TBD

### Repositories

- #### [sl-2-0-traffic-participant-model-repository-template](https://github.com/openMSL/sl-2-0-traffic-participant-model-repository-template)

<br>

---
<br>

## SL3 - Scenario Data

This sub-library contains example scenario data following the [ASAM OpenSCENARIO](https://www.asam.net/standards/detail/openscenario/) standard to provide interpretations for legislative documents such as the UN Regulation No. 157 in order to discuss them in the community.
In addition, it presents best practices for establishing quality gates for scenario databases that make the quality of scenario data transparent.

Initiated: Call for participation. Get engaged [hello@envited.market](mailto:hello@envited.market)

### Maintainer

- TBD (BMW AG)
- TBD
- TBD

### Repositories

- #### [sl-3-1-osc-alks-scenarios](https://github.com/asam-oss/OSC-ALKS-scenarios)

The 15 concrete, parametrized test scenarios provided here are derived from the 6 subject areas analogous to Annex 5, Chapters 4.1-4.6, as an initial attempt to clarify the described set of functional scenarios.

Each concrete scenario is complemented by a variation file to form a logical scenario, which then represents a set of concrete scenarios. By applying the parameter variation defined in the variation files to the parameters in the concrete scenarios, multiple concrete scenarios can be derived or generated to cover the required scenario space.

The scenarios provided here focus on safeguarding the planning aspects of an "Automated Lane Keeping System". By extending the scenarios with environmental conditions (e.g. light, rain or wind) or references to e.g. 3D models, aspects of sensor and actuator technology could also be simulated and validated.

more info: [click here](https://github.com/asam-oss/OSC-ALKS-scenarios)
<br>
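The derivation of concrete scenarios from a variation file can be illustrated as a simple cross product; the parameter names below are hypothetical, not taken from the actual ALKS variation files:

```python
from itertools import product

# Hypothetical parameter variations, in the spirit of an OpenSCENARIO
# variation file: each parameter gets a discrete set of values.
variations = {
    "Ego_InitSpeed_Ve0_kph": [5.0, 30.0, 60.0],
    "LeadVehicle_InitPosition_LaneId": [-3, -4],
}

# The cross product of all value sets yields the derived concrete
# scenarios covering the required scenario space.
concrete_scenarios = [
    dict(zip(variations, values)) for values in product(*variations.values())
]
```

Here two parameters with three and two values yield six concrete scenarios from one logical scenario.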

---
<br>

## SL4 - Static Environment Data

The German research project [GaiaX 4 PLC-AAD](https://www.gaia-x4plcaad.info/) develops quality metrics and tools to evaluate the successful integration of [ASAM OpenDRIVE](https://www.asam.net/standards/detail/opendrive) maps
with e.g. [glTF](https://www.khronos.org/gltf/) 3D models and their respective material data extensions.

Initiated: Call for participation. Get engaged [hello@envited.market](mailto:hello@envited.market)

### Maintainer

- TBD (BMW AG)
- TBD
- TBD

### Repositories

- In discussion

<br>

---
<br>

## SL5 - Tooling

This sub-library contains various tools to import, export, analyze and visualize co-simulation data.

Initiated: Call for participation. Get engaged [hello@envited.market](mailto:hello@envited.market)

### Maintainer

- TBD (Persival GmbH)
- TBD
- TBD

### Repositories

- #### [sl-5-1-srmd-validator](https://github.com/openMSL/sl-5-1-srmd-validator)

This Python code is meant to be used in a CI pipeline, e.g. a GitHub Action.
It looks for SRMD files in the root directory of the repository it is cloned into.
The found SRMD files are validated against the SRMD schema from [SSPTraceability](https://github.com/PMSFIT/SSPTraceability/).

more info: [click here](https://github.com/openMSL/sl-5-1-srmd-validator)
<br>
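A minimal sketch of the discovery and checking step, assuming the `.srmd` file extension; full schema validation against the SSPTraceability XSD (e.g. with `lxml.etree.XMLSchema`) is omitted here:

```python
import sys
from pathlib import Path
from xml.etree import ElementTree

def find_srmd_files(root="."):
    """Collect candidate SRMD files below `root` (assumed extension: .srmd)."""
    return sorted(Path(root).rglob("*.srmd"))

def check_well_formed(path):
    """Parse one file, reporting failures as GitHub error annotations."""
    try:
        ElementTree.parse(path)
        return True
    except ElementTree.ParseError as err:
        print(f"::error file={path}::{err}", file=sys.stderr)
        return False
```

The `::error file=...::` line is the GitHub Actions workflow command that turns a failure into a visible annotation on the pipeline run.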

- #### [sl-5-2-osi-field-checker](https://github.com/openMSL/sl-5-2-osi-field-checker)

This FMU checks if fields are missing in a received SensorData.
It is meant to be used in a co-simulation connected to the output of the model under test.
It will output missing OSI fields as GitHub annotations, so that it can be used directly in a GitHub CI pipeline.
The image below shows an example of a failed pipeline due to missing OSI fields in the SensorData.

<img src="https://github.com/openMSL/sl-5-2-osi-field-checker/raw/main/doc/osi-field-checker-output.png" width="800" alt="OSI Field Checker in CI Pipeline"/>

more info: [click here](https://github.com/openMSL/sl-5-2-osi-field-checker)
<br>
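The check-and-annotate pattern can be sketched as follows; the dict stands in for the parsed protobuf message (with the real `osi3` bindings one would query `HasField()` and the lengths of repeated fields), and the required field names are illustrative:

```python
def report_missing_fields(sensor_data, required=("timestamp", "moving_object")):
    """Return missing field names and emit one GitHub Actions
    '::error::' workflow command per missing field, so the CI run
    fails with a visible annotation."""
    missing = [name for name in required if not sensor_data.get(name)]
    for name in missing:
        print(f"::error::SensorData field '{name}' is missing")
    return missing
```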

- #### [sl-5-3-osmp-network-proxy](https://github.com/openMSL/sl-5-3-osmp-network-proxy)

This Network Proxy FMU can receive SensorView and SensorData via TCP/IP using ZeroMQ. The received data is directly given to the FMU output. The proxy can also send SensorView or SensorData received as FMU input via TCP/IP to a given IP address and port.

more info: [click here](https://github.com/openMSL/sl-5-3-osmp-network-proxy)
<br>
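The receive and forward paths can be sketched with `pyzmq`; the PUSH/PULL socket pattern and the endpoint strings below are assumptions for illustration, not necessarily the proxy's actual wiring:

```python
import zmq

def forward_once(recv_endpoint, send_endpoint):
    """Receive one serialized OSI message and forward it unchanged,
    sketching the proxy's receive and send paths."""
    ctx = zmq.Context.instance()
    pull = ctx.socket(zmq.PULL)
    pull.setsockopt(zmq.RCVTIMEO, 5000)  # avoid blocking forever
    pull.bind(recv_endpoint)
    push = ctx.socket(zmq.PUSH)
    push.connect(send_endpoint)
    try:
        payload = pull.recv()   # e.g. a serialized SensorView
        push.send(payload)      # handed on, like the FMU output
    finally:
        pull.close()
        push.close()
```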

- #### [sl-5-4-standalone-osi-trace-file-player](https://github.com/openMSL/sl-5-4-standalone-osi-trace-file-player)

This mini application can read a binary ASAM OSI trace file (SensorData or SensorView) and send it step by step via TCP using ZeroMQ.

more info: [click here](https://github.com/openMSL/sl-5-4-standalone-osi-trace-file-player)
<br>
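Stepping through such a trace can be sketched as follows, assuming the common binary OSI trace framing in which each protobuf-serialized message is preceded by its byte length as a 4-byte little-endian unsigned integer:

```python
import struct

def read_osi_trace(path):
    """Yield the raw serialized messages from a binary OSI trace file.

    Framing assumption: each message is preceded by its byte length
    as a 4-byte little-endian unsigned integer.
    """
    with open(path, "rb") as trace:
        while True:
            header = trace.read(4)
            if len(header) < 4:
                break  # end of file (or truncated header)
            (length,) = struct.unpack("<I", header)
            yield trace.read(length)
```

Each yielded chunk could then be parsed with the `osi3` bindings (e.g. `SensorView.FromString(chunk)`) before being sent out via ZeroMQ.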

- #### [sl-5-5-osi-trace-file-player](https://github.com/openMSL/sl-5-5-osi-trace-file-player)

This [FMU](https://fmi-standard.org/) can play binary OSI trace files.
The folder containing the trace files has to be passed as the FMI parameter _trace_path_.
The trace file player is built according to the [ASAM Open Simulation Interface (OSI)](https://github.com/OpenSimulationInterface/open-simulation-interface) and the [OSI Sensor Model Packaging (OSMP)](https://github.com/OpenSimulationInterface/osi-sensor-model-packaging) examples.

more info: [click here](https://github.com/openMSL/sl-5-5-osi-trace-file-player)
<br>

- #### [sl-5-6-osi-trace-file-writer](https://github.com/openMSL/sl-5-6-osi-trace-file-writer)

This [FMU](https://fmi-standard.org/) can write binary OSI SensorData trace files.
The folder the trace files shall be written to has to be passed as the FMI parameter _trace_path_.
The trace file writer is built according to
the [ASAM Open Simulation Interface (OSI)](https://github.com/OpenSimulationInterface/open-simulation-interface) and
the [OSI Sensor Model Packaging (OSMP)](https://github.com/OpenSimulationInterface/osi-sensor-model-packaging) examples.

more info: [click here](https://github.com/openMSL/sl-5-6-osi-trace-file-writer)
<br>
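The writing side mirrors the trace framing; a minimal sketch, again assuming the 4-byte little-endian length prefix of binary OSI traces:

```python
import struct

def append_osi_message(trace, payload):
    """Append one serialized OSI message to an open binary file,
    prefixing it with its length as a 4-byte little-endian unsigned
    integer (the assumed OSI trace framing)."""
    trace.write(struct.pack("<I", len(payload)))
    trace.write(payload)
```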
