The 12th meeting of the expert group “Blockchain Technology in Interorganisational Collaboration” took place over lunch on the 29th of April.
First, the members were informed about the opportunities offered by the databooster innovation process. The process gives members the benefit of exploring and testing ideas together with experts in the field, and can also provide support for project funding. The iterative process involves steps such as scouting for ideas, setting up a call, shaping and re-shaping the challenge and, ultimately, setting up a deep-dive workshop (https://databooster.ch/expertise/).
After these introductory remarks, the expert group hosted Daniel Rutishauser from inacta AG, who gave an overview of the new DLT law and future trends in the blockchain sector. Daniel presented inacta’s hypotheses on four key areas of the blockchain space: crypto assets, token economies, DLT solutions, and the DLT base layer. In the area of crypto assets in particular, Daniel explained that Switzerland has a competitive advantage due to the new, favorable DLT law, and he expects that many future crypto assets will be issued and traded in Switzerland. Among many other topics, the current hype around NFTs (non-fungible tokens) and its effect on the digital art market gave rise to much discussion among the experts. The members could not agree on the fundamentals behind the immense prices this new market has realised. Considering the number of open questions, NFTs might be a topic that deserves a meeting of its own.
The online meeting was concluded without an apéro, but with many new insights into the blockchain sector.
Marco Zgraggen, Managing Director, Sisag AG; Daniel Pfiffner, Managing Director, ProSim GmbH. Date of presentation: 16.03.21
The companies Sisag AG, Remec AG and ProSim GmbH have developed a mountain railway simulator that makes it possible to digitally model Alpine destinations, such as ski resorts with their infrastructure, in a short time, and to test and evaluate development options. The focus is primarily on capacity, operating costs and the behaviour of the various guests in the ski resort.
The simulator has two fields of application. The first is the strategic development of mountain railway areas: for example, what is the ideal dimensioning of a lift that is to be replaced, and what effects does that dimensioning have on the rest of the area? New piste routings or new installations and their effects on the area can likewise be tested in advance. The second is operational decision support: for example, what happens if, with a certain number of guests on site, I open an additional piste or close a lift today, or how many ticket desks do I need to open so that the waiting time at the valley station does not become too long?
1. What was the Challenge?
There were several demanding development steps. One was certainly defining the goals of the project: what are the questions that really concern mountain railway operators? Initially, development focused mainly on capacity planning. In the course of the project it became clear that cost calculations are an equally important part of the software’s value.
Another demanding step was modelling the behaviour of different groups of people in the area. This could be captured well and generically using agent-based simulation.
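The agent-based approach mentioned above can be illustrated with a minimal sketch. All numbers and the greedy queue-choice rule are illustrative assumptions for the sake of the example, not parameters or behaviour of the actual Sisag/ProSim simulator:

```python
# Minimal agent-based sketch of guests choosing between lifts.
# Capacities and guest counts are illustrative assumptions only.

class Lift:
    def __init__(self, name, capacity_per_step):
        self.name = name
        self.capacity = capacity_per_step  # guests transported per time step
        self.queue = 0

class Guest:
    def choose(self, lifts):
        # Each agent greedily joins the lift with the shortest queue.
        return min(lifts, key=lambda lift: lift.queue)

def simulate(num_guests, lifts, steps):
    guests = [Guest() for _ in range(num_guests)]
    total_waiting = []
    for _ in range(steps):
        for guest in guests:
            guest.choose(lifts).queue += 1               # guest lines up
        for lift in lifts:
            lift.queue -= min(lift.queue, lift.capacity)  # lift transports
        total_waiting.append(sum(lift.queue for lift in lifts))
    return total_waiting

lifts = [Lift("valley", 40), Lift("summit", 25)]
waits = simulate(num_guests=80, lifts=lifts, steps=10)
print(waits)  # combined queue grows by 15 per step: demand exceeds capacity
```

In a real digital twin, the agents would carry far richer behaviour (guest types, piste preferences, lunch breaks) and the lifts would be parameterised from the destination library, but the basic loop of agents choosing and queues being served is the same.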
2. By which Service-oriented Approach did we Solve it?
The digital twin is built and parameterised individually for each Alpine destination. Thanks to the prior development of a library for Alpine destinations, this can be done in a very short time. The software is then made available to the operator on a platform.
3. What are our learnings?
This is a real-world implementation of a digital twin. The presentation covered everything from goal definition through implementation to further development options and next steps.
New structure, new logo, new concept: the expert group “Machine Learning Clinic” is a unique pool of expert knowledge. Our last meeting aimed to connect experts willing to share their knowledge with companies that need expertise to push AI projects forward. Despite the hype around AI and deep learning in recent years, only a few deployed solutions are running in industry. Why is this? What are the missing bricks? One of the missions of the ML-Clinic is to close this gap between the lab and real-world applications.
During registration we identified needs and experts on the following hot topics:
Hardware / Edge-Processing
During a 90-minute virtual meeting we connected people, exchanged experiences, and brainstormed new ideas. With the new open-innovation initiative www.databooster.ch and the support of Innosuisse, there are many ways to support companies on their ML journey.
In an informal round we discussed real cases from Roche, Sulzer, SBB and others. One common issue is data quality and availability, along with working with rare scenarios: how do you deal with missing, wrong or corrupted data, and how do you train robust neural networks on such datasets? There is no easy solution, but there are more and more ideas on how to deal with these common industrial issues.
Besides the deep technology discussions, another highlight was the “non-virtual apéro package” which all participants received before the event. Even though we only communicated through bytes over glass fibre, everyone had a real chilled beer and some nuts in their hands. And what beer would be better to stimulate the real neurons than the AI beer: DEEPER.
Overall, a successful event, and we hope you tune in for the next get-together of the ML-Clinic!
The first Use-Case Talk of the year took place online on the 15th of March 2021. Industry, academic and individual members of the data innovation alliance came together to discuss data-driven innovation, confidential computing, and natural language processing for analytics.
The first speaker, Lucas Fiévet, Co-Founder and CEO of LogicFlow AG, presented an overview of how machine learning will ease the challenges of software test maintenance, testing non-functional requirements and test diagnosis. The talk then deep-dived into a use case of autoencoders for anomaly detection in web application screenshots.
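As a toy illustration of the reconstruction-error idea behind such anomaly detection, the sketch below substitutes a linear autoencoder (mathematically equivalent to PCA) for the deep model. The data is synthetic, and nothing here reflects LogicFlow’s actual implementation:

```python
import numpy as np

# Anomaly detection via reconstruction error: a linear autoencoder (PCA)
# is fitted on "normal" samples; samples that reconstruct poorly are
# flagged as anomalies. The 20-D vectors stand in for flattened
# screenshot features; all data below is synthetic.

rng = np.random.default_rng(42)

# Normal samples live near a 2-D subspace of the 20-D space.
basis = rng.normal(size=(2, 20))
normal = rng.normal(size=(200, 2)) @ basis + 0.01 * rng.normal(size=(200, 20))

# "Train" the encoder: top principal components of the normal data.
mean = normal.mean(axis=0)
_, _, vt = np.linalg.svd(normal - mean, full_matrices=False)
components = vt[:2]                               # 2-D latent code

def reconstruction_error(x):
    code = (x - mean) @ components.T              # encode
    recon = code @ components + mean              # decode
    return np.linalg.norm(x - recon, axis=-1)

# Threshold set above the worst reconstruction seen on normal data.
threshold = reconstruction_error(normal).max() * 1.5

anomaly = rng.normal(size=20) * 3                 # far off the subspace
print(reconstruction_error(normal[0]) < threshold)   # normal: passes
print(reconstruction_error(anomaly) > threshold)     # anomaly: flagged
```

A deep autoencoder replaces the linear projection with a learned nonlinear encoder/decoder, but the detection logic, thresholding the reconstruction error, is the same.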
Our second speaker, Grégoire Devauchelle, Data Scientist at Elca Informatik AG, told us how ELCA developed an information extraction engine for legal documents using Azure services and a custom application. The work also focused on finding the right balance between process automation and error rate of the model.
These two interesting Use-Case talks sparked a lively and interesting discussion. In the Q&A session we exchanged ideas, challenges and information among the industry and academic experts.
This was the first online Use-Case Talk this year; however, we remain hopeful that the next ones will take place in person at Aspaara’s venue in Technopark Zurich.
The Use-Case Talks are part of a series taking place three times a year. If you are interested in sharing your AI stories and discussing them within the community, you are warmly welcome to join us for our next Use-Case Talks taking place on 14th June 2021. If you are interested in presenting a Use-Case, please contact us by e-mail (firstname.lastname@example.org).
About the Use-Case Talks
The Use-Case Talk Series allows participants to enjoy in-depth technical discussions and exchange information about interesting technical challenges amongst experts. The Use-Case Talk Series are organized by Aspaara Algorithmic Solutions AG on behalf of data innovation alliance.
We are a start-up based in Geneva, Switzerland. We turn sounds into data, enabling cities to monitor urban noise in order to improve many aspects of city life. We combine Artificial Intelligence and the Internet of Things to hear our urban existence – this adds a missing layer to the smart city concept.
What is Securaxis’ background story?
Glen Meleder and Gaetan Vannay are the co-founders of Securaxis. We both have experience working in difficult and sometimes unsafe environments. Glen Meleder is an IT engineer by training; he has worked in duty stations for an important international organization with a focus on conflict resolution. Gaetan Vannay has worked extensively as a war correspondent. This is how we met. Our first project was a tool to manage security and security information in an operational framework through an app on a web platform. This tool is currently used by international organizations, governments and private companies all around the world.
Since then, Securaxis has evolved into something very different.
The idea of combining AI and acoustic analysis (“to turn sound into information”) came out of a hackathon organized in 2018 by CERN (the European Organization for Nuclear Research). This made immediate sense to us. In many contexts of armed conflict you may hear a threat before you see it. Sound is very useful information!
As humans, we are used to understanding and transforming sound to guide our actions. At Securaxis we aim to transfer this ability to smart cities; hearing will add the missing layer to smart solutions. Initially, we focused on safety and security. However, discussing with authorities in different cities, we understood that the immediate traction of this concept is in the domain of road traffic monitoring; this enables traffic management, dynamic smart lighting, predictive road maintenance and real-time monitoring of the level of traffic noise.
Securaxis is also active in biodiversity monitoring. Scientists have shown that sound is a reliable indicator for monitoring ecosystem biodiversity. It provides information about species, habits, quality of life and habitat conditions. It can also help to determine the many and varied interactions between wildlife, human habitats and human-built infrastructure. This solution can fulfil an important part of studies pertaining to the environmental impact of major construction projects. It is a surprise to us how far we have come from issues relating to people’s safety and security. But we are comfortable with this surprise!
Why are the projects important?
Road traffic causes 80% of noise pollution in cities. By 2050, 68% of the world population will live in cities or suburbs and road traffic will increase by more than 40%. It is well documented that exposure to excessive noise has an impact on people’s health. New tools to monitor and better understand noise in a city are needed urgently. There are also implications for privacy. People are wary of cameras, which are currently the main solution for traffic monitoring. With our system, the sounds never leave the street. There is no recording. Only specific sounds are detected, and these sounds are processed at sensor level; only metadata are sent.
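The “sound never leaves the street” design can be pictured as a small edge-processing loop. The classifier, the thresholds and the metadata schema below are purely illustrative assumptions, not Securaxis’ actual pipeline:

```python
import json
import math
import time

# Edge-processing sketch: raw audio frames are classified on the sensor
# and discarded; only metadata ever leaves the device.
# The classifier and metadata fields here are hypothetical.

def sound_level_db(frame):
    # Root-mean-square level of an audio frame, relative to full scale.
    rms = math.sqrt(sum(s * s for s in frame) / len(frame))
    return 20 * math.log10(max(rms, 1e-9))

def classify(frame):
    # Placeholder for the on-device ML model (e.g. "car", "truck", "siren").
    return "car" if sound_level_db(frame) > -20 else "background"

def process_frame(frame, sensor_id="GE-001"):
    metadata = {
        "sensor": sensor_id,
        "timestamp": time.time(),
        "label": classify(frame),
        "level_db": round(sound_level_db(frame), 1),
    }
    # The audio frame itself is dropped here; only metadata is transmitted.
    return json.dumps(metadata)

loud_frame = [0.5] * 1024    # synthetic frame, RMS 0.5 -> about -6 dBFS
print(process_frame(loud_frame))
```

Because classification happens before anything is transmitted, no conversation or identifiable audio can ever be reconstructed downstream, which is the property that matters for GDPR compliance.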
Who can profit from your services?
Our clients are OEMs (Original equipment manufacturers) and system integrators active in smart cities. At the end of the day, the people living in cities will profit from our approach. Cities can monitor and improve their urban environments whilst accommodating people’s concerns about privacy and sustainability.
Can you give some further examples of your success stories?
We have initiated projects all over Europe from Finland to Portugal. Because of Covid-19 we had to delay some deployments of sensors. So far, we already have installations running in Switzerland, France and the United Kingdom, and we will soon have installations in Luxembourg. These projects are going well. What we could call a success story is that the idea of monitoring road traffic and traffic noise by a combination of acoustic sensors and AI is well understood and validated as an accurate and very cost-effective solution.
How do your customers find you?
Before Covid-19 we were mainly present at European fairs and congresses. Today we participate in virtual events, but we have increased our footprint in them: we sponsor the events and/or organize virtual workshops and panels. We reallocated our travel and accommodation budget to upgrade our participation in virtual fairs, exhibitions and congresses, as a mere presence with a stand is not enough to attract prospects in this virtual world. Our potential customers find us at these events, which are organized around “Mobility” or “Smart Cities”. Of course, we also have our website (www.securaxis.com) and we work the phone. The good news: today clients are coming to us before we reach out to them.
What are your biggest challenges?
Initially, we found that people think of noise first in terms of recording or measuring decibels. We had to explain that what we are offering is very different. We had to show that cities can listen to themselves and that the technology that does this really works. The challenge today is to show that we comply with the GDPR (General Data Protection Regulation) and that our solution fully respects privacy. Our direct customers, OEMs and system integrators, understand our technology well, but city authorities, policy makers, lobbyists etc. often need further clarification. Notably, in France we went through the process of having our solution validated by the CNIL (Commission nationale de l’informatique et des libertés), the official national body in charge of protecting personal data and preserving individual liberties.
How do you see the future of Securaxis and what is your long-term goal?
Our long-term goal is to become a reference for recognition and monitoring of sounds; especially for smart cities.
On February 4, 2021, the expert group Smart Services got together online for another successful event in the “Service Lunch” series. Dr. Nikola Pascher from Kistler Instrumente AG presented the Kistler Innovation Lab as a powerful digitization booster. The talk was accompanied by lively discussions among the more than 50 participants. Please find a summary of the talk here:
Kistler is the global leader for providing modular solutions in dynamic measurement technology for pressure, force, torque and acceleration measurements. The company looks back on a continuously growing business, selling hardware and system solutions in various markets. Headquartered in Winterthur, Switzerland, and with various locations worldwide, Kistler’s next step is a digital transformation to maintain steady growth within the digital age. This involves the creation of the Kistler Innovation Lab as a powerful digitization booster.
The Innovation Lab follows the general vision “Turning data into value”. This means that we build on the vast amounts of data created with Kistler’s sensor technology and create value by using digital methods rooted in data science, mathematics and signal processing. Digital initiatives are pursued in a protected framework at a higher speed than is possible in the general corporate context. To accomplish this, the Innovation Lab stands on three pillars. With the co-creation platform, we connect different fields of expertise, share knowledge and data, and provide digital know-how. The digital technology incubator is a professional framework for quick experiments and ideas, with the ultimate goal of pursuing proof-of-concept projects for digital services and solutions based on Kistler sensor data. With the digital training center, we want to empower the Kistler team and our partners to identify digital business opportunities.
In the first part of the talk, we report on the general digital transformation mechanism at Kistler, with a focus on the ramp-up of the Innovation Lab within the corporate context. Despite the challenges imposed by the Covid-19 pandemic, the Innovation Lab has turned out to be a powerful tool, delivering first proof of Kistler’s data-based capabilities and strengthening our credibility with our team, customers and partners.
In the second part of the talk, we focus on the technical aspects of data-based services and solutions. All initiatives build on a powerful and scalable technology stack, which allows the quick set-up and deployment of cloud-based APIs. We report on first projects within the co-creation platform and the digital technology incubator. These projects aim at the fast creation of data-based services and solutions. In a co-creation project with our in-house sensor production, we aimed at optimizing a metal machining process inside a turning lathe. Together with the Kistler-internal machine shop, we made an important step towards a predictive maintenance and quality forecasting service. In a second project, we analyzed data from our weigh-in-motion (WIM) systems and realized that roughly 30% of all trucks are driving empty. With the help of a machine learning model, we can forecast the flows of empty and full trucks with high accuracy.
Can you briefly tell us what Kitro does and who you are?
Kitro aims to reduce food waste in the hospitality industry. We do this by analysing the food that is thrown away in large kitchens. Our solution is a fully automated IoT device that consists of two parts: a scale and a camera. The scale automatically detects when something is added to the bin, and this triggers the camera. The captured image is uploaded to the cloud, where it is analysed. The results of the analysis are uploaded to an online dashboard that our customers can access 24/7. Based on this, our customers can decide how to reduce their waste. Our customer service team can also help them understand the dashboard and the data in order to make better decisions.
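The trigger mechanism described above can be sketched as a small event loop. The threshold value and class structure are illustrative assumptions rather than Kitro’s actual firmware:

```python
# Sketch of an event-driven capture loop: the scale detects an added
# weight, which triggers an image capture for later analysis.
# The 50 g threshold and the capture/upload steps are hypothetical.

class WasteBinMonitor:
    def __init__(self, trigger_grams=50):
        self.trigger_grams = trigger_grams  # minimum added weight to react to
        self.last_weight = 0.0
        self.events = []

    def on_scale_reading(self, grams):
        added = grams - self.last_weight
        if added >= self.trigger_grams:
            # Something was thrown in: capture an image and log the event.
            self.events.append({"added_g": added, "image": self.capture()})
        self.last_weight = grams

    def capture(self):
        # Placeholder for the camera; the real device uploads the image
        # to the cloud, where it is analysed and shown on the dashboard.
        return "image.jpg"

monitor = WasteBinMonitor()
monitor.on_scale_reading(30)    # small fluctuation: below threshold, ignored
monitor.on_scale_reading(530)   # +500 g added: triggers a capture
print(len(monitor.events))      # 1
```

Keeping the trigger on the scale rather than the camera means images are only taken when waste is actually added, which keeps the data volume and the labelling effort manageable.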
We are currently developing our “best practices” page to serve as an inspiration for our customers. The idea of the “best practices” page is to have a collection of options on ways to reduce food waste.
What is Kitro’s background story?
The company was founded in 2017 by Naomi MacKenzie and Anastasia Hofmann, who studied at the École hôtelière de Lausanne. During their education they gained work experience in kitchens and service, where they saw how much food was being thrown away all the time. They wanted to tackle this issue and came up with the idea for Kitro. As neither Naomi nor Anastasia had a tech background, they brought in outside talent to develop the product while managing the business side themselves.
Why is it important that Kitro exists?
Globally, one third of all food is wasted along the supply chain. In the food and beverage industry, two thirds of all the food that is thrown away is waste that could be avoided; it is still edible but is disposed of anyway. This has a large environmental impact and is also a big cost for restaurants. With our product we hope to help restaurants reduce food waste and, consequently, reduce their environmental impact and save money.
Who can profit from your product?
We mainly work with larger schools, hospitals and canteens. We also work with restaurants and hotels, but they have to be of a certain size for it to be advantageous for them to use our product. We offer a subscription and the idea is that they save more than they invest.
For now, our product is not useful for individual persons who want to reduce their food waste – but that would be a cool idea!
Can you give some examples of your success stories?
From a tech perspective we have built a huge data set with all the data that we collected. We were able to train machine learning models based on this data that helped with the analysis and made the process more efficient and reduced costs.
We have also had customers from really early on in the project; the product has already been tested with customers for three years.
A big milestone is also that we grew the team to 12 people.
How do the customers find you?
In the beginning we took part in many competitions, such as start-up competitions, which gave us visibility. This helped the customers find us. We also went to conferences. Now we have also started to approach potential customers through cold calls and the like.
What are your biggest challenges?
From a tech perspective it’s challenging to automate the processes. We want to basically reduce or remove the human from the labelling in order to reduce costs and make the company profitable.
It has also been a challenge to create a clean data set that can be used for machine learning. This is still an ongoing process – it takes a lot of work. These are the biggest challenges.
How do you see the future of Kitro and what is your long-term goal?
The data that we collect could be interesting also for the government, not only for restaurants and food services. It could be interesting for them to get a better picture of the amount and kind of food that is thrown away, and to set up some kind of rules or guidelines in the future.
Food waste is also a problem in retail and on farms, not only in commercial kitchens and that is an area where we could make an impact.
In the future we also want to support our customers more with the decision making – with our “best practices” website and with predictive measures – based on the data collected. A first version is already online, but currently it is only visible to our customers on their dashboard.
The Summer Olympics “service excellence for you” kicked off on 1 July 2020. At the annual event of the Swiss Service Association SKDV, participants were invited to tackle the following question:
“Which company has the best customer service? Are you ready for the challenge?”
Kurt Ulmann (Vice President of the SKDV) had the idea of using the association to test how deeply service excellence is embedded in companies, and at the same time to deepen employees’ knowledge of the topic. Together with the learning/game platform from Quizmax and the know-how of Service Manufacture, the quiz “service excellence for you” was created, covering topics such as:
Excellence @ work
Mindset for Service Excellence
Service Excellence knowledge
“Only those who burn themselves can kindle a fire in others.” (Augustine of Hippo)
The game can be played via an app on a mobile phone as well as on the desktop. Players quiz against colleagues within their companies, or against the computer. They see their current level of knowledge and can reach different knowledge levels. At the end, the average score is calculated and compared with the other companies, producing a dynamic ranking of the participating firms.
The game is also available at any time, even outside the Olympics:
The SKDV provides the “service excellence for you” quiz free of charge to its members as well as to the members of the Technical Customer Service Chamber (TKK).
Any company that wants to develop its technical customer service staff can register for its own access.
Several companies competed for victory and the coveted title of “Best Service Company 2020”.
And The Winner is:
Hörmann Schweiz AG, led by Patrik Hostettler, Head of Technical Customer Service.
According to Mr Hostettler, customer enthusiasm is a top priority at Hörmann, which winning this Olympics underscores. In spring he will organize an employee event and redeem the prize, the “Ambassador Kart”, an exhilarating ride in the electric kart from Service Manufacture, where he will duly celebrate his service heroes.
Could you briefly tell us who Data Ahead Analytics is and what you do?
Data Ahead Analytics is a Zurich-based tech company. We enable our customers to get the carbon footprints of their value chains at a very granular level. We help them calculate the impacts and seize opportunities in their journey towards decarbonization.
What is Data Ahead Analytics’ background story?
Christian Spindler, CEO of Data Ahead Analytics, has a passion for bringing long-term sustainability dimensions to the business side of companies.
“When you look back at how sustainability was originally defined, there was a balance between the ecological, social and economic aspects of well-being: one aspect should not compromise the other aspects. In order to ensure overall sustainability, we should bring all these three elements together, in order to have thriving businesses that are ecologically and socially sustainable.”
Christian has a background in physics and was doing data science even before the term “data science” had been coined. During his PhD he started to combine climate research with data-driven modelling. He worked specifically on the statistical analysis of aerosol measurements, one of the contributors to climate change that still has many open questions. He enjoyed bringing together data science on one side and sustainability measures on the other. Christian then developed his professional career in large corporations such as ABB, and in the large consultancies Deloitte and PwC. At the end of 2018 he decided to engage full-time in the environmental fintech start-up Data Ahead Analytics.
The idea of Data Ahead Analytics started back in 2015, around COP21, the Conference of the Parties to the United Nations Framework Convention on Climate Change (UNFCCC) that took place in Paris in November 2015. The conference achieved a breakthrough in international climate negotiations and, consequently, in international climate economics. All the participating nations committed to intensifying their actions and investments to achieve a sustainable, low-carbon future. Even five years later, the Paris Agreement is considered very successful, because it was the very first time that all nations agreed to undertake such ambitious efforts to combat climate change. The idea of Data Ahead Analytics was developed around this momentum. We thought that when something of this magnitude happens, the entire accounting system, and the way we measure sustainability impacts and sustainability-related risks, will have to change, across the globe and for every company. We saw this as our opportunity to make a difference. We wanted to bring the carbon disclosure and carbon risk measurements then in use to a much deeper and more detailed level. This has proved to be very valuable, because there are many companies that want to become carbon neutral by 2030 but lack the means to achieve it. We help them on their journey by providing the means to count, in detail, where in their processes carbon is emitted, and we navigate them through the current uncertainty of the decarbonization ahead. And, of course, identifying the business opportunities found in decarbonizing a company is very exciting!
Now Data Ahead Analytics is run by three co-founders working with four students. We are also looking for more software developers.
Why is it important that Data Ahead Analytics exists?
It is important because in the long term we will be forced to build up the same kinds of accounting mechanisms for carbon neutralization that we currently have in the financial world, for example the IFRS (International Financial Reporting Standards). In the future, we will also have to take other externalized impacts into account, such as biodiversity degradation, waste, or air pollution. But for now, we focus on carbon. We hope to see a world where we count each and every molecule of carbon in the same way we count every dollar and cent in the financial books of organizations. We are at the forefront of shaping holistic carbon accounting. We also work on incorporating the mechanisms available in the financial industry, for example risk analysis, into the world of sustainability.
Who can profit from your services?
We have two major customer segments that profit from our services: financial services and non-financial companies. On the financial side, we are screening climate risks for both real-estate investors, and for corporate loan books of banks. On the non-financial side, we provide dynamic product carbon footprints that go deep down to a single process level and analyse granular emissions at each step in a certain process path, identify decarbonization potential and map emissions to future financial costs from a sustainability perspective. Our customers are individual businesses with all the services and products they offer.
It is exciting to see the different fields reinforcing each other: pressure from the financial sector forces companies to evaluate their sustainability processes. The large companies then push this need up their value chains, and in this way sustainability accounting expands to smaller companies, who may not necessarily have as much knowledge of, or experience in, ESG.
How do your customers find you?
Through feedback from our previous and current customers. Our customers specifically appreciate the collaborative philosophy of our work. The field of sustainability analytics is still young and not very standardized, so many customers need upskilling and are happy if we can help bring them to the next level in this exciting field.
Can you give some concrete examples?
An example from the finance field is a real-estate investment firm that is looking into sustainability risks in objects on the market. We are developing an API (Application Programming Interface) based solution in which a few basic input data points about a property are sufficient for a thorough analysis of the sustainability risks, particularly the climate risks, of a specific asset.
On the corporate side we recently concluded a very innovative project with the multinational company Siemens in Germany, where we went down to the granular production processes of a single product. We designed and implemented carbon footprinting for each individual product that runs through the line. We are currently industrializing the solution to deliver continuous, automated, and verifiable product carbon footprints. This demonstrates that we can go to new levels of depth with our technology.
What are your biggest challenges?
Our biggest challenge is that we have more possibilities to add value than we can thoroughly deliver. This is why it is especially important to focus on the relevant issues that we know we can deliver well. But this is not always easy, with all the different areas that simultaneously move forward fast today. We are unfortunately often forced to say “No” in order to keep on the relevant track in serving our customers.
How do you see the future of Data Ahead Analytics and what is your long-term goal?
Our vision is that companies will run a fully developed sustainability accounting system, catering to the same kind of regulations and following the same kind of standardization that we currently have in financial accounting. Today we are in a situation where we can build sustainability accounting as greenfield software development, doing things the right way digitally, end to end.
For businesses we are developing a Software as a Service product that allows companies from the financial and non-financial sector to acquire ESG data and run impact, risk, and opportunity analyses.
We want to keep being an innovative leader in this young field.
Is there anything else that you want to add?
We are looking for software developers; please feel free to drop Christian a note if you are interested in learning more!
Since December 2018, the Expert Group for Data Ethics within the Swiss Alliance for Data-Intensive Services has been working hard to create a comprehensive Code of Ethics for data-based businesses. Now the long work has borne fruit: on 10 December 2020 the Code of Ethics was published and presented at an online press conference. The Code is a tangible contribution from our Alliance to help companies make data-based business compatible with the core values of our society and avoid damage to their reputation. It is a contribution to ensuring the social acceptance and sustainability of data-based value creation in our country.
In recent years the availability of data has increased enormously which has led to a profound economic change. In many sectors, SMEs and large companies are considering how to develop new products and services by collecting, buying, storing and analysing data. This process of data-based value creation is accompanied by regular reports on unethical behaviour such as the violation of customer privacy, the use of “unfair” algorithms or unintended social effects of new data services. These incidents undermine consumer confidence and are partly responsible for the fact that even carefully designed apps such as the SwissCovid app are met with scepticism by quite a few people.
Trust in data-based value creation does not only depend on compliance with data protection law. In contrast to various comparable documents, the code addresses concrete problems of data management in a practice-oriented manner. It not only explains the basic ethical issues, but also provides concrete recommendations. Furthermore, it addresses the question of how the code of ethics can be integrated into the concrete business processes of companies. The code is therefore a comprehensive guideline for responsible action. The code is intended to help companies and institutions to boost the confidence of consumers and politicians in the value-added use of data.
The code of ethics comprehensively addresses all relevant issues related to the data lifecycle and aims to help both SMEs and large companies to achieve ethical data value creation.
It consists of the following documents: 1) Overview 2) Basics 3) Recommendations 4) Implementation 5) Context.
The code was developed by experts from the Swiss Alliance for Data-Intensive Services. In particular, representatives of the “Digital Ethics Lab” of the Digital Society Initiative of the University of Zurich, technical experts from Zurich and Valais Universities of Applied Sciences and representatives of companies were involved.
The Code of Ethics is available on the Alliance’s website in German, English, French and Italian.