Data Analysis Training Courses

Local, instructor-led live Data Analysis (also known as Data Analytics) training courses demonstrate through discussion and hands-on practice the programming languages and methodologies used to perform Data Analysis.

Data Analysis training is available as "onsite live training" or "remote live training". Onsite live Data Analysis training can be carried out locally on customer premises in Jordan or in NobleProg corporate training centers in Jordan. Remote live training is carried out by way of an interactive, remote desktop.

NobleProg -- Your Local Training Provider


Data Analysis Course Outlines

Title
Duration
Overview
14 hours
Overview
There are plenty of tried and tested patterns widely available to everyone. Sometimes it is simply a matter of changing the names and implementing the pattern in a specific technology. This can save hundreds of hours that would otherwise be spent on design and testing.

Training Goals

This course has two goals: first, to let you reuse widely-known patterns; second, to let you create and reuse patterns specific to your organization. It also helps you estimate how patterns can reduce costs, systematize the design process, and generate a code framework based on your patterns.

Audience

Software designers, business analysts, project managers, programmers and developers, as well as operational managers and software division managers.

Course Style

The course focuses on use cases and their relationship with a specific pattern. Most of the examples are explained in UML and in simple Java examples (the language can change if the course is booked as a closed course). It guides you through the sources of the patterns, and shows you how to catalogue and describe patterns that can be reused across your organization.
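Design patterns are language-independent. As a flavor of what reusing one looks like, here is a minimal sketch of the widely known Observer pattern, written in Python for brevity rather than the Java used in the course; all class and event names are illustrative only.

```python
# Minimal Observer pattern sketch (illustrative names, not course material).
# A Subject keeps a list of observers and notifies each one of events.

class Subject:
    def __init__(self):
        self._observers = []

    def attach(self, observer):
        self._observers.append(observer)

    def notify(self, event):
        for observer in self._observers:
            observer.update(event)

class LoggingObserver:
    """An observer that simply records every event it is told about."""
    def __init__(self):
        self.events = []

    def update(self, event):
        self.events.append(event)

subject = Subject()
logger = LoggingObserver()
subject.attach(logger)
subject.notify("order_created")
print(logger.events)
```

The same structure carries over to any technology: only the names and the notification transport change.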
21 hours
Overview
Microsoft Power BI is a free Software as a Service (SaaS) suite for analyzing data and sharing insights. Power BI dashboards provide a 360-degree view of the most important metrics in one place, updated in real time, and available on all of your devices.

In this instructor-led, live training, participants will learn how to use Microsoft Power BI to analyze and visualize data using a series of sample data sets.

By the end of this training, participants will be able to:

- Create visually compelling dashboards that provide valuable insights into data.
- Obtain and integrate data from multiple data sources.
- Build and share visualizations with team members.
- Adjust data with Power BI Desktop.

Format of the course

- Part lecture, part discussion, exercises and heavy hands-on practice

Note

- To request customized training for this course, please contact us to arrange it.
14 hours
Overview
Tableau is a business intelligence and data visualization tool. Python is a widely used programming language which provides support for a wide variety of statistical and machine learning techniques. Tableau's data visualization power and Python's machine learning capabilities, when combined, help developers rapidly build advanced data analytics applications for various business use cases.

In this instructor-led, live training, participants will learn how to combine Tableau and Python to carry out advanced analytics. Integration of Tableau and Python will be done via the TabPy API.

By the end of this training, participants will be able to:

- Integrate Tableau and Python using TabPy API
- Use the integration of Tableau and Python to analyze complex business scenarios with a few lines of Python code
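To sketch the shape such an integration takes: TabPy calls a deployed Python function with each Tableau measure passed as a list of values, and expects a list of the same length back. The z-score function below is an illustrative stand-in, not course material, and the TabPy deployment step itself is omitted.

```python
# A sketch of the kind of function Tableau invokes through TabPy.
# Tableau passes a measure as a list; the result must be a list of
# the same length. The z-score logic here is purely illustrative.

def zscore(values):
    mean = sum(values) / len(values)
    variance = sum((v - mean) ** 2 for v in values) / len(values)
    std = variance ** 0.5
    return [(v - mean) / std for v in values]

# In Tableau, a calculated field would call this via SCRIPT_REAL once the
# function has been deployed to a running TabPy server (not shown here).
print(zscore([1.0, 2.0, 3.0]))
```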

Audience

- Developers
- Data scientists

Format of the course

- Part lecture, part discussion, exercises and heavy hands-on practice
21 hours
Overview
A geographic information system (GIS) is a system designed to capture, store, manipulate, analyze, manage, and present spatial or geographic data. The acronym GIS is sometimes used for geographic information science (GIScience) to refer to the academic discipline that studies geographic information systems and is a large domain within the broader academic discipline of geoinformatics.

QGIS functions as geographic information system (GIS) software, allowing users to analyze and edit spatial information, in addition to composing and exporting graphical maps. QGIS supports both raster and vector layers; vector data is stored as either point, line, or polygon features. Multiple formats of raster images are supported, and the software can georeference images. In summary, it allows users to create, edit, visualize, analyze, and publish geospatial information on Windows, Mac, Linux, and BSD.

This program, in its first phase, introduces the QGIS interface for general usage. In the second phase, we introduce PyQGIS - the Python libraries of QGIS that allow the integration of GIS functionality into your Python code or application, so that you may even create your own Python plugin around a particular GIS functionality.
21 hours
Overview
A geographic information system (GIS) is a system designed to capture, store, manipulate, analyze, manage, and present spatial or geographic data. The acronym GIS is sometimes used for geographic information science (GIScience) to refer to the academic discipline that studies geographic information systems and is a large domain within the broader academic discipline of geoinformatics.

The use of Python with GIS has substantially increased over the last two decades, particularly with the introduction of the Python 2.0 series in 2000, which included many new programming features that made the language much easier to deploy. Since then, Python has been utilized not only within commercial GIS products such as those by Esri, but also in open-source platforms, including QGIS and GRASS. In fact, Python is today by far the most widely used language among GIS users and programmers.

This program covers the usage of Python and advanced libraries such as geopandas, pysal, bokeh, and osmnx to implement your own GIS features. The program also includes introductory modules on the ArcGIS API and the QGIS toolbox.
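As a taste of what such libraries do under the hood, here is a toy point-in-polygon test, the basic spatial predicate that geopandas (via shapely) provides out of the box. This is pure Python for illustration; real libraries handle degenerate edges, holes, and coordinate reference systems.

```python
# Ray-casting point-in-polygon test (illustrative only): cast a horizontal
# ray from the point and count how many polygon edges it crosses. An odd
# number of crossings means the point is inside.

def point_in_polygon(x, y, polygon):
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge straddles the ray's y level
            # x-coordinate where this edge crosses the ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

square = [(0, 0), (4, 0), (4, 4), (0, 4)]
print(point_in_polygon(2, 2, square))  # point inside the square
print(point_in_polygon(5, 2, square))  # point outside the square
```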
14 hours
Overview
AI is a collection of technologies for building intelligent systems capable of understanding data and the activities surrounding the data to make "intelligent decisions". For Telecom providers, building applications and services that make use of AI could open the door for improved operations and servicing in areas such as maintenance and network optimization.

In this course we examine the various technologies that make up AI and the skill sets required to put them to use. Throughout the course, we examine AI's specific applications within the Telecom industry.

Audience

- Network engineers
- Network operations personnel
- Telecom technical managers

Format of the course

- Part lecture, part discussion, hands-on exercises
21 hours
Overview
Apache Drill is a schema-free, distributed, in-memory columnar SQL query engine for Hadoop, NoSQL and other Cloud and file storage systems. The power of Apache Drill lies in its ability to join data from multiple data stores using a single query. Apache Drill supports numerous NoSQL databases and file systems, including HBase, MongoDB, MapR-DB, HDFS, MapR-FS, Amazon S3, Azure Blob Storage, Google Cloud Storage, Swift, NAS and local files. Apache Drill is the open-source version of Google's Dremel system, which Google offers as an infrastructure service called Google BigQuery.

In this instructor-led, live training, participants will learn the fundamentals of Apache Drill, then leverage the power and convenience of SQL to interactively query big data across multiple data sources, without writing code. Participants will also learn how to optimize their Drill queries for distributed SQL execution.

By the end of this training, participants will be able to:

- Perform "self-service" exploration on structured and semi-structured data on Hadoop
- Query known as well as unknown data using SQL queries
- Understand how Apache Drill receives and executes queries
- Write SQL queries to analyze different types of data, including structured data in Hive, semi-structured data in HBase or MapR-DB tables, and data saved in files such as Parquet and JSON
- Use Apache Drill to perform on-the-fly schema discovery, bypassing the need for complex ETL and schema operations
- Integrate Apache Drill with BI (Business Intelligence) tools such as Tableau, Qlikview, MicroStrategy and Excel
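The "on-the-fly schema discovery" idea can be sketched in a few lines: scan semi-structured records and derive the columns and their observed types as you read, with no upfront DDL or ETL step. This toy function illustrates the concept only; it is not Drill's implementation.

```python
# Schema-on-read, illustrated: derive a schema from JSON-like records at
# scan time instead of declaring it in advance. Records may have different
# fields, and the discovered schema simply reflects what was observed.

def discover_schema(records):
    schema = {}
    for record in records:
        for column, value in record.items():
            # Record every Python type seen for this column
            schema.setdefault(column, set()).add(type(value).__name__)
    return schema

rows = [
    {"name": "alice", "age": 34},
    {"name": "bob", "age": 28, "city": "Amman"},  # extra field is fine
]
print(discover_schema(rows))
```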

Audience

- Data analysts
- Data scientists
- SQL programmers

Format of the course

- Part lecture, part discussion, exercises and heavy hands-on practice
35 hours
Overview
Advances in technology and the increasing amount of information are transforming how law enforcement is conducted. The challenges that Big Data pose are nearly as daunting as Big Data's promise. Storing data efficiently is one of these challenges; effectively analyzing it is another.

In this instructor-led, live training, participants will learn the mindset with which to approach Big Data technologies, assess their impact on existing processes and policies, and implement these technologies for the purpose of identifying criminal activity and preventing crime. Case studies from law enforcement organizations around the world will be examined to gain insights on their adoption approaches, challenges and results.

By the end of this training, participants will be able to:

- Combine Big Data technology with traditional data gathering processes to piece together a story during an investigation
- Implement industrial big data storage and processing solutions for data analysis
- Prepare a proposal for the adoption of the most adequate tools and processes for enabling a data-driven approach to criminal investigation

Audience

- Law Enforcement specialists with a technical background

Format of the course

- Part lecture, part discussion, exercises and heavy hands-on practice
28 hours
Overview
R is a popular programming language in the financial industry. It is used in financial applications ranging from core trading programs to risk management systems.

In this instructor-led, live training, participants will learn how to use R to develop practical applications for solving a number of specific finance related problems.

By the end of this training, participants will be able to:

- Understand the fundamentals of the R programming language
- Select and utilize R packages and techniques to organize, visualize, and analyze financial data from various sources (CSV, Excel, databases, web, etc.)
- Build applications that solve problems related to asset allocation, risk analysis, investment performance and more
- Troubleshoot, integrate, deploy, and optimize an R application

Audience

- Developers
- Analysts
- Quants

Format of the course

- Part lecture, part discussion, exercises and heavy hands-on practice

Note

- This training aims to provide solutions for some of the principal problems faced by finance professionals. However, if you have a particular topic, tool or technique that you wish to add or elaborate on further, please contact us to arrange it.
7 hours
Overview
Highcharts is a JavaScript library for creating interactive graphical charts on the Web. It is commonly used to represent data in a more readable and interactive fashion.

In this instructor-led, live training, participants will learn how to create high-quality data visualizations for web applications using Highcharts.

By the end of this training, participants will be able to:

- Set up interactive charts on the Web using only HTML and JavaScript
- Represent large datasets in visually interesting and interactive ways
- Export charts to JPEG, PNG, SVG, or PDF
- Integrate Highcharts with jQuery Mobile for cross-platform compatibility

Audience

- Developers

Format of the course

- Part lecture, part discussion, exercises and heavy hands-on practice
7 hours
Overview
D3.js (or D3 for Data-Driven Documents) is a JavaScript library that uses SVG, HTML5, and CSS for producing dynamic, interactive data visualizations in web browsers.

In this instructor-led, live training, participants will learn how to create web-based data-driven visualizations that run on multiple devices responsively.

By the end of this training, participants will be able to:

- Use D3 to create interactive graphics, information dashboards, infographics and maps.
- Control HTML with jQuery-like selections.
- Transform the DOM by selecting elements and joining to data.
- Export SVG for use in print publications.

Format of the course

- Part lecture, part discussion, exercises and heavy hands-on practice
28 hours
Overview
Microsoft Power BI is a free Software as a Service (SaaS) suite for analyzing data and sharing insights. Power BI dashboards provide a 360-degree view of the most important metrics in one place, updated in real time, and available on all of your devices.

In this instructor-led, live training, participants will learn how to use Power BI to develop custom software solutions for the Power BI and Azure platforms.

By the end of this training, participants will be able to:

- Configure real-time dashboards.
- Create custom visualizations.
- Integrate rich analytics into existing applications.
- Embed interactive reports and visuals into existing applications.
- Access data from within an application.
- Master Power BI Portal, Desktop, Embedded and Rest API.
- Integrate R Scripts into Power BI Desktop.

Format of the course

- Part lecture, part discussion, exercises and heavy hands-on practice

Note

- To request customized training for this course, please contact us to arrange it.
14 hours
Overview
Prescriptive analytics is a branch of business analytics, together with descriptive and predictive analytics. It uses predictive models to suggest actions to take for optimal outcomes, relying on optimization and rules-based techniques as a basis for decision making.

In this instructor-led, live training, participants will learn how to use Matlab to carry out prescriptive analytics on a set of sample data.

By the end of this training, participants will be able to:

- Understand the key concepts and frameworks used in prescriptive analytics
- Use MATLAB and its toolboxes to acquire, clean and explore data
- Use rules-based techniques including inference engines, scorecards, and decision trees to make decisions based on different business scenarios
- Use Monte Carlo simulation to analyze uncertainties and ensure sound decision making
- Deploy predictive and prescriptive models to enterprise systems
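A Monte Carlo decision step of the kind described above can be sketched as follows; this is written in plain Python rather than the MATLAB used in the course, purely for illustration, and the demand distribution, prices and stock levels are invented. We estimate expected profit under demand uncertainty for two candidate stock levels and pick the better one.

```python
# Monte Carlo sketch of a prescriptive decision: simulate uncertain demand
# many times, score each candidate action, and recommend the best one.
# All numbers here are made up for illustration.

import random

def expected_profit(stock, price=10.0, cost=6.0, trials=20_000, seed=42):
    rng = random.Random(seed)  # fixed seed for a reproducible estimate
    total = 0.0
    for _ in range(trials):
        demand = rng.gauss(100, 15)           # uncertain demand, assumed normal
        sold = min(stock, max(demand, 0.0))   # cannot sell more than stocked
        total += sold * price - stock * cost  # revenue minus purchase cost
    return total / trials

candidates = [90, 110]
best = max(candidates, key=expected_profit)  # the prescribed action
print(best)
```

The same shape scales up: more candidate actions, richer simulation models, and optimization instead of exhaustive comparison.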

Audience

- Business analysts
- Operations planners
- Functional managers
- BI (Business Intelligence) team members

Format of the course

- Part lecture, part discussion, exercises and heavy hands-on practice
21 hours
Overview
Dremio is an open-source "self-service data platform" that accelerates the querying of different types of data sources. Dremio integrates with relational databases, Apache Hadoop, MongoDB, Amazon S3, Elasticsearch, and other data sources. It supports SQL and provides a web UI for building queries.

In this instructor-led, live training, participants will learn how to install, configure and use Dremio as a unifying layer for data analysis tools and the underlying data repositories.

By the end of this training, participants will be able to:

- Install and configure Dremio
- Execute queries against multiple data sources, regardless of location, size, or structure
- Integrate Dremio with BI tools and data sources such as Tableau and Elasticsearch

Audience

- Data scientists
- Business analysts
- Data engineers

Format of the course

- Part lecture, part discussion, exercises and heavy hands-on practice

Notes

- To request customized training for this course, please contact us to arrange it.
7 hours
Overview
The Tidyverse is a collection of versatile R packages for cleaning, processing, modeling, and visualizing data. Some of the packages included are: ggplot2, dplyr, tidyr, readr, purrr, and tibble.

In this instructor-led, live training, participants will learn how to manipulate and visualize data using the tools included in the Tidyverse.

By the end of this training, participants will be able to:

- Perform data analysis and create appealing visualizations
- Draw useful conclusions from various datasets of sample data
- Filter, sort and summarize data to answer exploratory questions
- Turn processed data into informative line plots, bar plots, and histograms
- Import and filter data from diverse data sources, including Excel, CSV, and SPSS files
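The core dplyr pipeline, filter() %>% group_by() %>% summarise(), can be mimicked in plain Python to show the shape of the computation the Tidyverse wraps in readable verbs; the sales data below is invented for illustration.

```python
# The dplyr pattern filter(amount > 50) %>% group_by(region) %>%
# summarise(total = sum(amount)), expressed in plain Python.
# Data is made up for illustration.

sales = [
    {"region": "north", "amount": 120},
    {"region": "south", "amount": 80},
    {"region": "north", "amount": 60},
    {"region": "south", "amount": 40},
]

totals = {}
for row in sales:
    if row["amount"] > 50:                                   # filter
        region = row["region"]                               # group_by key
        totals[region] = totals.get(region, 0) + row["amount"]  # summarise

print(totals)
```

In R, the same result takes three chained verbs and reads almost like the sentence describing it; that readability is the point of the Tidyverse.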

Audience

- Beginners to the R language
- Beginners to data analysis and data visualization

Format of the course

- Part lecture, part discussion, exercises and heavy hands-on practice
21 hours
Overview
Data science is the application of statistical analysis, machine learning, data visualization and programming for the purpose of understanding and interpreting real-world data. F# is a well-suited programming language for data science as it combines efficient execution, REPL scripting, powerful libraries, and scalable data integration.

In this instructor-led, live training, participants will learn how to use F# to solve a series of real-world data science problems.

By the end of this training, participants will be able to:

- Use F#'s integrated data science packages
- Use F# to interoperate with other languages and platforms, including Excel, R, Matlab, and Python
- Use the Deedle package to solve time series problems
- Carry out advanced analysis with minimal lines of production-quality code
- Understand how functional programming is a natural fit for scientific and big data computations
- Access and visualize data with F#
- Apply F# for machine learning
- Explore solutions for problems in domains such as business intelligence and social gaming

Audience

- Developers
- Data scientists

Format of the course

- Part lecture, part discussion, exercises and heavy hands-on practice
21 hours
Overview
In this instructor-led, live training, participants will learn the skills, strategies, tools and approaches for visualizing and reporting data for different audiences. Case studies are also analyzed and discussed to exemplify how data visualization solutions are being applied in the real world to derive meaning out of data and answer crucial questions.

By the end of this training, participants will be able to:

- Write reports with captivating titles, subtitles, and annotations using the most suitable highlighting, alignment, and color schemes for readability and user friendliness.
- Design charts that fit the audience's information needs and interests.
- Choose the best chart types for a given dataset (beyond pie charts and bar charts).
- Identify and analyze the most valuable and relevant data quickly and efficiently.
- Select the best file formats to include in reports (graphs, infographics, references, GIFs, etc.)
- Create effective layouts for displaying time series data, part-to-whole relationships, geographic patterns, and nested data.
- Use effective color-coding to display qualitative and text-based data such as sentiment analysis, timelines, calendars, and diagrams.
- Apply the most suitable tools for the job (Excel, R, Tableau, mapping programs, etc.)
- Prepare datasets for visualization.

Format of the course

- Part lecture, part discussion, exercises and heavy hands-on practice
21 hours
Overview
kdb+ is an in-memory, column-oriented database and q is its built-in, interpreted, vector-based language. In kdb+, tables are columns of vectors, and q is used to perform operations on the table data as if it were a list. kdb+ and q are commonly used in high-frequency trading and are popular with major financial institutions, including Goldman Sachs, Morgan Stanley, Merrill Lynch and JP Morgan.

In this instructor-led, live training, participants will learn how to create a time series data application using kdb+ and q.

By the end of this training, participants will be able to:

- Understand the difference between a row-oriented database and a column-oriented database
- Select data, write scripts and create functions to carry out advanced analytics
- Analyze time series data such as stock and commodity exchange data
- Use kdb+'s in-memory capabilities to store, analyze, process and retrieve large data sets at high speed
- Think of functions and data at a higher level than the standard function(arguments) approach common in non-vector languages
- Explore other time-sensitive applications for kdb+, including energy trading, telecommunications, sensor data, log data, and machine and network usage monitoring
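The column-vector model behind these points can be sketched in plain Python: a table is a dict of named column vectors, and an operation touches only the columns it needs rather than iterating over whole row records. This illustrates the model only; q syntax is not shown, and the trade data is invented.

```python
# A kdb+-style table as a dict of column vectors (data invented).
trades = {
    "sym":   ["AAPL", "AAPL", "MSFT", "MSFT"],
    "price": [188.0, 190.0, 410.0, 412.0],
    "size":  [100, 50, 200, 25],
}

# Vector-style computation: notional value (price * size) per trade,
# reading only the two columns involved.
notional = [p * s for p, s in zip(trades["price"], trades["size"])]

# Roughly a q "select sum notional by sym" aggregation:
by_sym = {}
for sym, value in zip(trades["sym"], notional):
    by_sym[sym] = by_sym.get(sym, 0.0) + value

print(by_sym)
```

In q the aggregation above is a single expression, and the columnar layout is what lets kdb+ scan billions of rows of time series data quickly: each query reads only the columns it names.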

Audience

- Developers
- Database engineers
- Data scientists
- Data analysts

Format of the course

- Part lecture, part discussion, exercises and heavy hands-on practice
14 hours
Overview
Embedding Projector is an open-source web application for visualizing the data used to train machine learning systems. Created by Google, it is part of TensorFlow.

This instructor-led, live training introduces the concepts behind Embedding Projector and walks participants through the setup of a demo project.

By the end of this training, participants will be able to:

- Explore how data is being interpreted by machine learning models
- Navigate through 3D and 2D views of data to understand how a machine learning algorithm interprets it
- Understand the concepts behind embeddings and their role in representing mathematical vectors for images, words and numerals
- Explore the properties of a specific embedding to understand the behavior of a model
- Apply Embedding Projector to real-world use cases such as building a song recommendation system for music lovers
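The idea behind embeddings can be shown with a toy example: items (words, images, songs) become vectors, and distance between vectors stands in for similarity between items; cosine similarity is one of the distance measures the Embedding Projector offers. The three-dimensional "embeddings" below are hand-made for illustration; real embeddings are learned and have hundreds of dimensions.

```python
# Cosine similarity between embedding vectors (hand-made toy vectors).
def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(x * x for x in b) ** 0.5
    return dot / (na * nb)

embeddings = {
    "cat": [0.9, 0.8, 0.1],
    "dog": [0.8, 0.9, 0.2],
    "car": [0.1, 0.2, 0.9],
}

# A well-trained embedding places "cat" closer to "dog" than to "car":
print(cosine(embeddings["cat"], embeddings["dog"]))
print(cosine(embeddings["cat"], embeddings["car"]))
```

A song recommender built on embeddings works the same way: embed every song, then recommend the nearest neighbors of what the listener already likes.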

Audience

- Developers
- Data scientists

Format of the course

- Part lecture, part discussion, exercises and heavy hands-on practice
14 hours
Overview
deck.gl is an open-source, WebGL-powered library for exploring and visualizing data assets at scale. Created by Uber, it is especially useful for gaining insights from geospatial data sources, such as data on maps.

This instructor-led, live training introduces the concepts and functionality behind deck.gl and walks participants through the set up of a demonstration project.

By the end of this training, participants will be able to:

- Take data from very large collections and turn it into compelling visual representations
- Visualize data collected from transportation and journey-related use cases, such as pick-up and drop-off experiences, network traffic, etc.
- Apply layering techniques to geospatial data to depict changes in data over time
- Integrate deck.gl with React (for reactive programming) and Mapbox GL (for visualizations on Mapbox-based maps)
- Understand and explore other use cases for deck.gl, including visualizing points collected from a 3D indoor scan, visualizing machine learning models in order to optimize their algorithms, etc.

Audience

- Developers
- Data scientists

Format of the course

- Part lecture, part discussion, exercises and heavy hands-on practice
14 hours
Overview
Datameer is a business intelligence and analytics platform built on Hadoop. It allows end-users to access, explore and correlate large-scale, structured, semi-structured and unstructured data in an easy-to-use fashion.

In this instructor-led, live training, participants will learn how to use Datameer to overcome Hadoop's steep learning curve as they step through the setup and analysis of a series of big data sources.

By the end of this training, participants will be able to:

- Create, curate, and interactively explore an enterprise data lake
- Access business intelligence data warehouses, transactional databases and other analytic stores
- Use a spreadsheet user interface to design end-to-end data processing pipelines
- Access pre-built functions to explore complex data relationships
- Use drag-and-drop wizards to visualize data and create dashboards
- Use tables, charts, graphs, and maps to analyze query results

Audience

- Data analysts

Format of the course

- Part lecture, part discussion, exercises and heavy hands-on practice
14 hours
Overview
Apache Zeppelin is a web-based notebook for capturing, exploring, visualizing and sharing Hadoop and Spark based data.

This instructor-led, live training introduces the concepts behind interactive data analytics and walks participants through the deployment and usage of Zeppelin in a single-user or multi-user environment.

By the end of this training, participants will be able to:

- Install and configure Zeppelin
- Develop, organize, execute and share data in a browser-based interface
- Visualize results without referring to the command line or cluster details
- Execute and collaborate on long workflows
- Work with any of a number of plug-in language/data-processing-backends, such as Scala (with Apache Spark), Python (with Apache Spark), Spark SQL, JDBC, Markdown and Shell.
- Integrate Zeppelin with Spark, Flink and MapReduce
- Secure multi-user instances of Zeppelin with Apache Shiro

Audience

- Data engineers
- Data analysts
- Data scientists
- Software developers

Format of the course

- Part lecture, part discussion, exercises and heavy hands-on practice
14 hours
Overview
Magellan is an open-source distributed execution engine for geospatial analytics on big data. Implemented on top of Apache Spark, it extends Spark SQL and provides a relational abstraction for geospatial analytics.

This instructor-led, live training introduces the concepts and approaches for implementing geospatial analytics and walks participants through the creation of a predictive analysis application using Magellan on Spark.

By the end of this training, participants will be able to:

- Efficiently query, parse and join geospatial datasets at scale
- Implement geospatial data in business intelligence and predictive analytics applications
- Use spatial context to extend the capabilities of mobile devices, sensors, logs, and wearables

Audience

- Application developers

Format of the course

- Part lecture, part discussion, exercises and heavy hands-on practice
7 hours
Overview
Dashbuilder is an open-source web application for visually creating business dashboards and reports.

In this instructor-led, live training, participants will learn how to set up, configure, integrate and deploy Dashbuilder.

By the end of this training, participants will be able to:

- Extract data from heterogeneous sources such as JDBC databases and text files
- Use connectors to connect to third-party systems and platforms such as jBPM
- Configure roles, permissions and access controls for users
- Deploy Dashbuilder to a live production environment

Audience

- Developers
- IT and system architects

Format of the course

- Part lecture, part discussion, exercises and heavy hands-on practice
14 hours
Overview
Dashbuilder is an open-source web application for visually creating business dashboards and reports.

In this instructor-led, live training, participants will learn how to create business dashboards and reports using Dashbuilder.

By the end of this training, participants will be able to:

- Visually configure and personalize dashboards using drag-and-drop
- Create different types of visualizations using charting libraries
- Define interactive report tables
- Create and edit inline KPIs (Key Performance Indicators)
- Customize the look and feel of metric displayers

Audience

- Managers
- Analysts

Format of the course

- Part lecture, part discussion, exercises and heavy hands-on practice
14 hours
Overview
ECharts is a free, open-source JavaScript library used for interactive charting and data visualization.

In this instructor-led, live training, participants will learn the fundamental functionalities of ECharts as they step through the process of creating and configuring charts using ECharts.

By the end of this training, participants will be able to:

- Understand the fundamentals of ECharts
- Explore and utilize the various features and configuration options in ECharts
- Build their own simple, interactive, and responsive charts with ECharts

Audience

- Developers

Format of the course

- Part lecture, part discussion, exercises and heavy hands-on practice
21 hours
Overview
Tableau is a business intelligence and data visualization tool. Tableau offers a wide array of developer tools and APIs for integration, customization, automation, and extension of Tableau features for organization-specific needs.

In this instructor-led, live training, participants will learn how to extend the capabilities of Tableau to fit the specific needs of their organization.

By the end of this training, participants will be able to:

- Install and configure Tableau (Desktop, Server, Online).
- Understand the fundamentals of developing with Tableau.
- Create and publish data visualizations and interactions with Tableau.
- Use Tableau's developer tools and APIs to customize and extend the capabilities of Tableau for their organization.

Format of the course

- Part lecture, part discussion, exercises and heavy hands-on practice

Note

- To request customized training for this course, please contact us to arrange it.
28 hours
Overview
Talend Open Studio for Data Integration is an open-source data integration product used to combine, convert and update data in various locations across a business.

In this instructor-led, live training, participants will learn how to use the Talend ETL tool to carry out data transformation, data extraction, and connectivity with Hadoop, Hive, and Pig.

By the end of this training, participants will be able to:

- Explain the concepts behind ETL (Extract, Transform, Load) and propagation.
- Define ETL methods and ETL tools to connect with Hadoop.
- Efficiently amass, retrieve, digest, consume, transform and shape big data in accordance with business requirements.
- Upload to and extract large records from Hadoop (optional), Hive (optional), and NoSQL databases.

Format of the course

- Part lecture, part discussion, exercises and heavy hands-on practice

Note

- To request customized training for this course, please contact us to arrange it.
14 hours
Overview
In this instructor-led, live training, participants will learn three different approaches for accessing, analyzing and visualizing data. We start with an introduction to RDBMS databases; the focus will be on accessing and querying an Oracle database using the SQL language. Then we look at strategies for accessing an RDBMS database programmatically using the Python language. Finally, we look at how to visualize and present data graphically using TIBCO Spotfire.

Format of the Course

- Interactive lecture and discussion.
- Lots of exercises and practice.
- Hands-on implementation in a live-lab environment.
14 hours
Overview
This instructor-led, live training (onsite or remote) is aimed at HR professionals and recruitment specialists who wish to use analytical methods to improve organisational performance. This course covers qualitative as well as quantitative, empirical, and statistical approaches.

Format of the Course

- Interactive lecture and discussion.
- Lots of exercises and practice.

Course Customization Options

- To request customized training for this course, please contact us to arrange it.

Course Discounts Newsletter

We respect the privacy of your email address. We will not pass on or sell your address to others.
You can always change your preferences or unsubscribe completely.

Some of our clients

NobleProg is growing fast!

We are looking to expand our presence in Jordan!

As a Business Development Manager you will:

  • expand business in Jordan
  • recruit local talent (sales, agents, trainers, consultants)

We offer:

  • Artificial Intelligence and Big Data systems to support your local operation
  • high-tech automation
  • continuously upgraded course catalogue and content
  • good fun in an international team

Are you interested in running a high-tech, high-quality training and consulting business?

Apply now!