Services

Big Data infrastructure

We define Big Data architecture by creating a design tailored to the customer's specific needs.

Data processing

We design the data processing system, verify and test it, and implement analyses in the field of machine learning. Subsequently, we organise training focused on the use of these tools.

Data analysis

Big Data analysis is based on an examination of the data provided by the customer, supplemented by other sources. We find correlations and relationships between phenomena and the facts behind them.


 

Big Data infrastructure

Data processing, storage and analysis require adequate infrastructure. Its key characteristics are a distributed architecture and linear scalability. Batch-oriented systems such as Hadoop or NoSQL databases have their own specifics and differ from real-time data processing, where Apache Kafka and Apache Flink are the most widespread tools. We are proficient in both.
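For illustration, here is a minimal sketch of real-time ingestion with Apache Kafka using the kafka-python client; the broker address, topic name and record fields are purely illustrative assumptions, not part of any specific deployment:

  import json
  from kafka import KafkaProducer, KafkaConsumer

  # Assumed broker address and topic name, for illustration only.
  producer = KafkaProducer(
      bootstrap_servers="localhost:9092",
      value_serializer=lambda v: json.dumps(v).encode("utf-8"),
  )
  producer.send("sensor-readings", {"sensor_id": 7, "temperature": 21.4})
  producer.flush()

  consumer = KafkaConsumer(
      "sensor-readings",
      bootstrap_servers="localhost:9092",
      auto_offset_reset="earliest",
      value_deserializer=lambda b: json.loads(b.decode("utf-8")),
  )
  for message in consumer:
      print(message.value)   # each record is processed as soon as it arrives
      break                  # stop after one record in this sketch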

We provide these services

  • Big Data systems architecture
  • Strategy definition
  • Architecture design
  • Health check
  • Installation of a Hadoop cluster or a real-time cluster
  • Maintenance and support
  • Specialised training
 

Data processing

Infrastructure alone, however, is not enough to create value; the data also need to be processed. Existing sources and systems produce data in their own specific formats, which differ in type, character and quality and are generally not suitable for storage and use in their raw state. That is why data processing is our most frequent task. We like doing it, because we know how.
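As a minimal sketch of what this typically involves, the snippet below normalises records from two hypothetical source systems, each with its own field names, date formats and units, into one consistent shape; all names, formats and values are illustrative assumptions:

  from datetime import datetime

  # Raw records from two hypothetical source systems (illustrative only).
  raw_records = [
      {"ts": "2024-05-01 10:15", "temp_c": "21.4", "site": "A"},
      {"timestamp": "01/05/2024 10:20", "temperature_f": "70.5", "location": "B"},
  ]

  def normalise(record):
      """Convert a raw record into a single consistent format."""
      if "ts" in record:                      # shape used by source system 1
          when = datetime.strptime(record["ts"], "%Y-%m-%d %H:%M")
          celsius = float(record["temp_c"])
          site = record["site"]
      else:                                   # shape used by source system 2
          when = datetime.strptime(record["timestamp"], "%d/%m/%Y %H:%M")
          celsius = (float(record["temperature_f"]) - 32) * 5 / 9
          site = record["location"]
      return {"timestamp": when.isoformat(),
              "temperature_c": round(celsius, 1),
              "site": site}

  clean = [normalise(r) for r in raw_records]
  print(clean)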

We provide these services

  • Data processing design
  • Data processing examination
  • Tool configuration for data processing
  • Contract development/Design by contract
  • Data processing for machine learning
  • Training in the tools used

Machine learning

Machine learning is a field of artificial intelligence that studies programming methods whose goal is to give systems the ability to learn. Using statistics, it tries to understand the circumstances that generate the data, often in order to test various hypotheses, and to find patterns in the data that are understandable to people.

In practice this is used, for example, for customer segmentation in marketing campaigns and for producing individualised offers.
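A minimal sketch of such segmentation, grouping customers into clusters with scikit-learn's k-means; the features, values and number of segments are illustrative assumptions:

  import numpy as np
  from sklearn.cluster import KMeans

  # Illustrative customer features: [yearly spend, number of orders]
  customers = np.array([
      [120,  2], [150,  3], [900, 25], [950, 30],
      [130,  2], [980, 28], [140,  4], [910, 26],
  ])

  # Assume three segments purely for illustration.
  model = KMeans(n_clusters=3, n_init=10, random_state=0).fit(customers)
  print(model.labels_)           # segment index assigned to each customer
  print(model.cluster_centers_)  # average spend / orders per segment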

 

Data analysis

Today, extracting value from data offers the greatest potential for companies' growth, for optimising their expenses and for their business models. Having the right information at the right time is the holy grail of business, and it cannot be achieved in any other way than through data analysis.

We explore, for example, how the outside temperature influences energy consumption or sales, which customer segments exist, and which products have the best chance of succeeding in a campaign. Industrial customers, on the other hand, are more interested in the effectiveness of their production, the identification of factors influencing quality, or the prediction of accidents.
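A minimal sketch of the first kind of question, measuring the correlation between outside temperature and energy consumption with pandas; the figures are illustrative assumptions, not real measurements:

  import pandas as pd

  # Illustrative daily observations: outside temperature and energy consumption.
  df = pd.DataFrame({
      "temperature_c": [-5, 0, 5, 10, 15, 20, 25],
      "consumption_kwh": [95, 80, 70, 55, 45, 40, 38],
  })

  # Pearson correlation: a value close to -1 here would suggest that
  # consumption falls as the outside temperature rises.
  print(df["temperature_c"].corr(df["consumption_kwh"]))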

We provide these services

  • Analysis of the customer's data
  • Supplementing data with other sources
  • Search for correlations
  • Interpreting results for the customer's business needs

Predictive analysis

Predictive analysis is a set of highly sophisticated analytical methods aimed at finding correlations among various, often seemingly unrelated, types of data. It is an evolving field of data analysis with a growing number of algorithms. Essentially, it focuses on the search for characteristics and models of behaviour. Many algorithms reveal correlations in the data, from which it is possible to infer the likely cause.

It is widely used in sales, where, based on a customer's browsing habits, it is possible to predict certain types of behaviour, preferences and risk levels.
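A minimal sketch of this idea, fitting a logistic regression model with scikit-learn to estimate the probability of a purchase from simple browsing features; the features and data are illustrative assumptions:

  import numpy as np
  from sklearn.linear_model import LogisticRegression

  # Illustrative browsing features per visitor: [pages viewed, minutes on site]
  visits = np.array([[2, 1], [3, 2], [15, 12], [20, 18], [4, 3], [18, 15]])
  bought = np.array([0, 0, 1, 1, 0, 1])   # 1 = the visitor later made a purchase

  model = LogisticRegression().fit(visits, bought)

  # Predicted probability that a new visitor (10 pages, 8 minutes) will buy.
  print(model.predict_proba([[10, 8]])[0, 1])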