We tackle every facet of Data Science as a Service.
The setup of an internal data lake is the foundation of a successful data journey. Often, we can already increase the downstream value significantly at this stage by integrating external data sources.
A detailed analysis of the underlying data and problem determines the choice of the most appropriate method. We train and evaluate the models extremely critically to achieve a robust practical solution.
Whether findings can actually be used often hinges on a meaningful, simple integration into the existing process ecosystem. Therefore, we focus not only on the technical integration but also on the comprehensible visualization of data in web applications.
Internal data value. Business processes constantly generate data that lies unused and scattered across a variety of systems. Transferring this data into a usable infrastructure is essential. Our experience helps find the right trade-off between financial viability and data volume.
External sources. The use of external data is often neglected because the effort of evaluation and matching seems high. We are experts in web scraping and oversee the collection of Media Summary's daily, nationwide ad data. Our scraping results are enriched using AI and matched just-in-time with the Media Summary database.
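As a rough illustration of the just-in-time matching step, here is a minimal sketch using fuzzy string matching against a toy reference list. All names and the cutoff value are hypothetical stand-ins; the actual Media Summary pipeline enriches records with AI before matching and works against a far larger database.

```python
import difflib

# Toy reference list of advertisers (hypothetical names, standing in
# for the Media Summary database).
REFERENCE = ["Acme Motors GmbH", "Beispiel Bank AG", "Muster Moebel KG"]

def match_advertiser(scraped_name, cutoff=0.6):
    """Match a scraped advertiser name against the reference list.

    Case-folds both sides, finds the closest reference entry, and
    returns it in its original spelling, or None if nothing is
    similar enough.
    """
    normalized = {entry.casefold(): entry for entry in REFERENCE}
    hits = difflib.get_close_matches(
        scraped_name.casefold(), list(normalized), n=1, cutoff=cutoff
    )
    return normalized[hits[0]] if hits else None

print(match_advertiser("ACME Motors"))   # tolerates spelling variants
print(match_advertiser("Unknown Corp"))  # no sufficiently close match
```

In practice the cutoff controls the precision/recall trade-off: a higher value yields fewer but safer matches.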
Data Analysis. Our interdisciplinary team of Data Scientists analyzes problems in terms of potential Machine Learning solutions. To find the optimal approach, we invest considerable time in tracking the current state of research. This allows us to constantly add innovative tools to our infrastructure and offer flexible solution strategies.
Training and Evaluation. The development of AI systems has parallels to competitive sports: without hard work, potential remains untapped. We love competition, and our team spurs each other on to top performance. We place particular emphasis on practical solutions, which is why we evaluate our models extremely critically.
Simple visualization. Digitalization makes data spaces and their interrelationships appear ever more complex. We know how to visualize them in a comprehensible way. For this purpose, we develop modern, intuitive web applications. One of our data journeys leads to the Job Cube, which provides insight into the job market.
Seamless integration. AI systems can completely replace manual efforts. However, they often also support decision-making. Our goal is therefore always seamless integration into business processes. This increases both acceptance and value creation and determines the success of the data journey.
Follow our current projects and get insight into real-world use cases.
As part of a recent project, we were faced with the task of compiling contacts for specific competences within public administrations across Germany. To avoid working through this task manually, we scraped the municipality websites and trained an NER model that extracts contact details for a number of departments directly from their official internet presence. The result is a comprehensive list of over 10,000 fully qualified contacts.
As part of his internship, Maurice took on the task of automatically determining information about the provider of a website. For this purpose, we trained an NER (named-entity recognition) model that extracts all relevant data from the site's imprint (Impressum). You can try the result here.
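To give a feel for the kind of fields such an extractor returns, here is a minimal regex-based sketch over a made-up imprint text. This is explicitly not the trained NER model described above, only a stand-in for illustration; the imprint text, field names, and patterns are all hypothetical.

```python
import re

# Hypothetical imprint text; the production system applies a trained
# NER model to pages like this.
IMPRINT = """
Impressum
Beispiel GmbH
Musterstrasse 1, 80331 Muenchen
E-Mail: kontakt@beispiel.de
Telefon: +49 89 1234567
"""

def extract_contact(text):
    """Pull basic provider contact details out of an imprint page.

    Uses simple regular expressions for email and phone number as a
    stand-in for the model's entity predictions.
    """
    email = re.search(r"[\w.+-]+@[\w-]+\.[\w.]+", text)
    phone = re.search(r"\+?\d[\d /-]{6,}\d", text)
    return {
        "email": email.group(0) if email else None,
        "phone": phone.group(0) if phone else None,
    }

print(extract_contact(IMPRINT))
```

A trained NER model generalizes far beyond such patterns, recognizing company names, addresses, and responsible persons from context rather than fixed formats.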
We attach great importance to proximity to excellent research institutions and a correspondingly dynamic environment. The Foxy Bytes Data Science Unit is therefore located in Munich and consists mainly of bioinformatics graduates. We have never forgotten our roots: the headquarters in the Rhine-Main area is home to the Foxy Bytes front-end experts. Successful collaboration across distributed locations is therefore an integral part of our open corporate culture.