1. Please briefly introduce yourself. What is your background, how did you come to AI?
My name is Jonas Rietsch, I first studied physics in Erlangen and had my first contact with machine learning in my master's degree. In my master thesis I worked on the classification of sleep phases, a form of time series classification. At that time, I was already interested in Natural Language Processing and thus came across adigi during my job search. I have been employed there as an ML Engineer for 3 years.
2. Which company and product / service are we talking about specifically?
Adigi is a B2B service provider for travel agencies. The agencies forward requests to us, which are processed automatically using AI. This takes work off the travel consultants.
5. What algorithms / type of AI do you use?
We exclusively use neural networks; more precisely, pre-trained BERT-like transformers, which we continue to train in an unsupervised fashion. In addition, we also use smaller networks that are trained "from scratch" in a supervised manner.
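As a rough illustration of this setup, the following sketch continues training a pre-trained German BERT model with masked language modeling on unlabeled request texts using the Hugging Face libraries; the model name, example texts and hyperparameters are illustrative assumptions, not adigi's actual configuration.

```python
# Hedged sketch: unsupervised further training (masked language modeling) of a
# pre-trained BERT-like model on domain texts. All names and values are assumptions.
from datasets import Dataset
from transformers import (AutoModelForMaskedLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

checkpoint = "bert-base-german-cased"  # assumed base model
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForMaskedLM.from_pretrained(checkpoint)

# Unlabeled travel-request texts (placeholder examples).
texts = [
    "Hallo, wir suchen eine Pauschalreise fuer 2 Erwachsene und 1 Kind nach Mallorca.",
    "Gesucht: Hotel mit Halbpension vom 10. bis 24. August, moeglichst strandnah.",
]
dataset = Dataset.from_dict({"text": texts}).map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=128),
    batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="mlm-travel", num_train_epochs=1,
                           per_device_train_batch_size=8),
    train_dataset=dataset,
    # Randomly masks 15% of the tokens so the model learns the domain vocabulary.
    data_collator=DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15),
)
trainer.train()
```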
6. What added value does the AI provide for the user?
Our solution enables travel agencies to process typical requests, such as those for package tours, very quickly and to generate highly specific offers. As a result, a higher overall booking rate can be achieved than with purely manual processing.
7. Could you have solved the problem with a traditional algorithm without AI? If no: Why was an AI necessary?
No. The data is very unstructured: the "needs", such as the travel period, the number of people and preferences, have to be extracted from the text queries. Spelling errors also occur, and place or hotel names in particular would be hurdles. A rule-based approach would therefore not be feasible because of this unstructured nature.
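To make the extraction step more tangible, here is a minimal sketch of how such "needs" could be pulled out of a free-text request with a token-classification (NER-style) transformer; the model identifier and label set are hypothetical placeholders, not the production system.

```python
# Hedged sketch: NER-style extraction of travel "needs" from an unstructured request.
# The model name and its labels (PERIOD, PERSONS, DESTINATION, ...) are assumed.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="my-org/travel-request-ner",   # hypothetical fine-tuned model
    aggregation_strategy="simple",       # merge sub-word tokens into whole entities
)

request = "Hallo, wir suchen fuer 2 Erwachsene und 1 Kind vom 10.-24.8. etwas auf Malorca"
for entity in ner(request):
    # e.g. PERSONS "2 Erwachsene und 1 Kind", PERIOD "10.-24.8.", DESTINATION "Malorca"
    print(entity["entity_group"], entity["word"], round(entity["score"], 2))
```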
8. What hurdles were there in implementing the AI and how did you overcome them?
There are and were several hurdles. For one, detection alone is not enough; we also need to assess the reliability of the AI's predictions so that we can decide whether a human needs to check the result. The raw model output (e.g. softmax values) is not sufficient for this. In addition, entities have to be normalized; errors can happen in this step and have to be quantified. In general, measuring the quality of the individual steps is not trivial. Another problem is the versioning of models and datasets, since typical Git version management only works to a limited extent here. Test-driven development methods also cannot be transferred easily to machine learning projects. We are actively working on such problems, for example using hard-coded tests for validation and fuzzy matching against a list of possible values; if scores are not in the green zone, a human intervenes.
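The normalization and gating idea described above could look roughly like the following sketch, which fuzzy-matches a recognized entity against a list of known values and escalates to a human when the score is not "in the green zone"; the value list and threshold are assumptions.

```python
# Hedged sketch: fuzzy matching against a list of possible values plus a
# confidence gate deciding whether a human needs to check the result.
from difflib import SequenceMatcher

KNOWN_DESTINATIONS = ["Mallorca", "Menorca", "Gran Canaria", "Teneriffa"]  # assumed list

def normalize(raw: str, candidates: list[str], threshold: float = 0.85):
    """Return (best_match, needs_human_review) for a possibly misspelled entity."""
    scored = [(c, SequenceMatcher(None, raw.lower(), c.lower()).ratio()) for c in candidates]
    best, score = max(scored, key=lambda item: item[1])
    return best, score < threshold          # below threshold -> human intervenes

match, needs_review = normalize("Malorca", KNOWN_DESTINATIONS)
print(match, "human check needed:", needs_review)   # Mallorca, human check needed: False
```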
9. Where does the data you use for training come from?
The queries we receive provide enough data for supervised learning. In addition, the pre-trained models we build on were trained on large amounts of unstructured text.
12. What are your next steps? For example, is the model re-trained on a regular basis?
Re-training happens irregularly, for example when there is a change to the data, its preparation, or the model. Since we mainly fine-tune here, the costs remain limited. We are constantly developing our models further, evaluating alternatives and, among other things, working to solve the problems mentioned above.
13. What would have supported you in your intention to use AI? E.g., advanced training, GPU computing power, memory....
The ability to pre-train our own language model. Because our texts are domain-specific, such pretraining would be helpful; however, it requires a lot of computing power. Further training would definitely be interesting for us, especially with practical content such as deployment in the cloud. For experienced software engineers, beginner-friendly courses on AI would also be useful.
1. Please briefly introduce yourself. What is your background, how did you come to AI?
I first studied mechatronics in the bachelor's program and then information technology in the master's program. In my studies, I already had contact with AI and data science through lectures. At work, I have had a lot to do with cloud and AI through the project management of digitization projects. I recently became Head of Artificial Intelligence at Krones AG.
2. What company and product/service are you specifically talking about?
Krones uses AI at various points in the value chain - both for internal optimizations and for our products. A concrete example is our Linatronic AI, an inspection unit that has been able to significantly reduce the false rejection rate through Deep Learning.
3. What is the importance of AI for this?
The importance of AI for Krones is growing steadily. It is an important building block of the digital transformation.
4. What algorithms / type of AI do you use?
We use the full range of AI methods. Symbolic, knowledge-based AI as well as machine learning and deep learning are used, depending on the use case. We use not only supervised and unsupervised learning but also reinforcement learning (RL).
For example, smart maintenance strategies allow our customers to carry out maintenance more efficiently themselves or to purchase support from Krones. Our goal is to maximize line output, reduce scrap, and shorten unplanned downtime. Here, AI also helps us keep production quality high, which directly influences output.
6. Could you have solved the problem with a traditional algorithm without AI? If no: Why was AI necessary?
Some applications could not have been solved without AI. One example is large-scale control processes with too many process parameters for conventional controllers. We addressed this with RL, and initial field tests were successful.
In other cases, there were traditional solutions, but they have been significantly outperformed by AI and therefore superseded. These include visual quality inspection, which has been significantly improved by Deep Learning algorithms.
These requirements influence, for example, the decision where to deploy the model, which model type to choose, and how to optimize the model, since all processes on a target system compete for the same resources.
This always has to be weighed up from use case to use case.
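As one concrete, purely illustrative example of "optimizing the model" for a resource-constrained target system, post-training dynamic quantization can shrink a network and cut its CPU inference cost; the model and the choice of technique below are assumptions, not Krones' actual approach.

```python
# Hedged sketch: dynamic int8 quantization of a placeholder PyTorch model so it
# consumes fewer resources on a shared target system. Illustrative only.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 2))  # placeholder

# Quantize the linear layers to int8 weights; activations are quantized dynamically.
quantized = torch.ao.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)
print(quantized)
```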
With RL there was also the issue that, unlike in toy applications such as video games, one cannot run an arbitrary number of episodes to learn the correct behavior, especially not in a production system. For agent training we therefore had to create simulation environments and digital twins; machine learning was used for this as well.
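A toy version of that idea: train the RL agent against a simulation environment instead of the real line. The following sketch, with an invented one-dimensional process model and off-the-shelf libraries, is only meant to illustrate the pattern, not the actual digital twins.

```python
# Hedged sketch: a toy Gymnasium environment standing in for a process simulation /
# digital twin, trained with a standard RL algorithm. Dynamics and reward are invented.
import numpy as np
import gymnasium as gym
from gymnasium import spaces
from stable_baselines3 import PPO

class ProcessSim(gym.Env):
    """Toy simulation: keep a single process value close to its setpoint (0)."""
    def __init__(self):
        self.observation_space = spaces.Box(low=-10.0, high=10.0, shape=(1,), dtype=np.float32)
        self.action_space = spaces.Box(low=-1.0, high=1.0, shape=(1,), dtype=np.float32)
        self.state = np.zeros(1, dtype=np.float32)

    def reset(self, seed=None, options=None):
        super().reset(seed=seed)
        self.state = self.np_random.uniform(-5, 5, size=1).astype(np.float32)
        return self.state, {}

    def step(self, action):
        self.state = np.clip(self.state + action, -10, 10).astype(np.float32)
        reward = -float(abs(self.state[0]))   # closer to the setpoint is better
        return self.state, reward, False, False, {}

model = PPO("MlpPolicy", ProcessSim(), verbose=0)
model.learn(total_timesteps=5_000)            # learn on the simulation, not the plant
```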
9. Did your software developers have prior experience in using AI?
It varies a lot within the team. Some colleagues already had prior experience from their job, studies or doctorate; in some cases, however, they taught themselves the content. We place a lot of emphasis on training and continuing education.
Issues such as data drift and concept drift need to be addressed preemptively, and the corresponding measures need to be considered at the design stage.
We have therefore designed monitoring and operations processes from the beginning that allow us to track the quality and performance of deployed models.
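A very small example of what one building block of such monitoring could look like: compare the distribution of a live input feature against its training-time reference and raise a flag when it has drifted. The test, threshold and data below are illustrative assumptions.

```python
# Hedged sketch: basic data-drift check with a two-sample Kolmogorov-Smirnov test.
import numpy as np
from scipy.stats import ks_2samp

def drifted(reference: np.ndarray, live: np.ndarray, alpha: float = 0.01) -> bool:
    """Flag drift when the two samples are unlikely to share a distribution."""
    statistic, p_value = ks_2samp(reference, live)
    return p_value < alpha

reference = np.random.default_rng(0).normal(0.0, 1.0, 5_000)   # training-time feature values
live = np.random.default_rng(1).normal(0.4, 1.0, 1_000)        # shifted production values
if drifted(reference, live):
    print("Data drift detected - trigger review / re-training.")
```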
12. What would have supported you in your plan to use AI? E.g., advanced training, GPU computing power, storage space....
Computing power and storage were already in place in our case and would therefore not have been necessary. However, since AI is a very fast-moving field, it is important to keep up with the times. That is why we are always open to networking and exchange, for example on best practices, research approaches, application areas and practical experience.
My name is Reinis Vicups and I am a co-founder and CTO at the Technological Institute for Applied Artificial Intelligence (TIKI). During my studies at TU Riga in Latvia, I worked for Siemens. There I worked early on with predecessors of Machine Learning (ML), e.g. Petri nets, and dealt a lot with automation. Through assignments abroad I ended up in Nuremberg and have now been in Germany for 20 years. I moved from Siemens to Samhammer AG, where I worked as a developer, architect and project manager. Around 2013 a new project on text analysis and clustering came up at Samhammer, which is how I got into ML. After a couple of years, most of the internal use cases had been implemented successfully. However, we didn't simply stop doing ML: the funded research projects that followed became the origin of TIKI's spin-off from Samhammer AG. One goal was to share AI research results with Bavarian SMEs.
And I am Timo Walter, Machine Learning Engineer at TIKI. I first studied electrical engineering and information technology at the OTH Amberg-Weiden as part of a dual study program with BHS Corrugated Maschinen- und Anlagenbau GmbH. There I took a liking to software development and therefore did a master's in computer science, again as a dual program, in Regensburg. Afterwards, however, I wanted to do "more" than just software development and bring about real change. I could immediately identify with TIKI's vision and have been here for almost 4 years now.
2. What is the company about?
TIKI was founded in 2017 and the shareholders are Samhammer AG, Krones AG and Zollner AG. One of the founding impulses also came from the Bavarian Ministry of Economics and the University of Bayreuth. The TIKI business model is to build productive AI applications for our shareholders and selected customers as well as to integrate them into their respective productive environments. In order to solve this task effectively and sustainably, TIKI has built its own AI development environment in the form of the Data Science Platform (DSP) and operates it on its own infrastructure.
In the next step of TIKI's development, we will make our AI processes and expertise available to the open market in the form of joint projects with third-party customers.
Due to our success in building productive AI applications, we currently see strong demand from some large Bavarian companies that would like to become shareholders in TIKI.