Posts

Malware is so 2017: five new security trends to watch out for

Outbreaks such as Petya and WannaCry really put the malware threat on the IT agenda and made cybersecurity a priority for everyone. Fredrik Svantes, Senior Information Security Manager at Basefarm, explains the latest developments that keep the cybersecurity community busy.

What is Big Data? – A definition with five Vs

To define where Big Data begins and from which point the targeted use of data becomes a Big Data project, you need to take a look at the details and key features of Big Data. Its definition is most commonly based on the 3-V model from the analysts at Gartner and, while this model is certainly important and correct, it is now time to add another two crucial factors.

Big Data definition – the three fundamental Vs:

  • Volume refers to the huge amount of data that companies, for example, produce each day. So much data is generated, and it is so complex, that it can no longer be stored or analyzed using conventional data processing methods.
  • Variety refers to the diversity of data types and data sources. 80 percent of the data in the world today is unstructured and at first glance shows no indication of relationships. Thanks to Big Data algorithms, this data can be brought into a structured form and examined for relationships. Data does not always comprise only conventional datasets, but also images, videos and speech recordings.
  • Velocity refers to the speed with which the data is generated, analyzed and processed further. Today this is often possible within a fraction of a second, known as real time.

Big Data definition – two crucial, additional Vs:

  • Validity is the guarantee of data quality; alternatively, Veracity denotes the authenticity and credibility of the data. Big Data involves working with all degrees of quality, since sheer Volume usually comes at the expense of quality.
  • Value denotes the added value for companies. Many companies have recently established their own data platforms, filled their data pools and invested a lot of money in infrastructure. It is now a question of generating business value from their investments.

As we wrote in our previous blog post, defining Big Data is not easy, since the term touches on many aspects and disciplines. For many companies, what matters most is business success (Value). The key to it is gaining new information, which must be available to many users very quickly (Velocity), from huge amounts of data (Volume) drawn from highly diverse sources (Variety) and of differing quality (Validity), so that important decisions can be made quickly enough to gain or maintain a competitive advantage.
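
To make the five Vs a little more concrete, here is a minimal, purely illustrative Python sketch that profiles a tiny, invented batch of records for Volume, Variety, Velocity and Validity; Value only appears once such measurements feed real business decisions. The record format and field names are assumptions made for this example, not part of any particular product.

    from collections import Counter
    from datetime import datetime

    # Hypothetical mixed-quality batch: structured readings next to free text.
    records = [
        {"type": "sensor", "value": 21.7, "ts": "2018-02-01T10:00:00+00:00"},
        {"type": "text", "value": "door left open", "ts": "2018-02-01T10:00:01+00:00"},
        {"type": "sensor", "value": None, "ts": "2018-02-01T10:00:02+00:00"},  # invalid reading
    ]

    def profile(batch):
        """Summarize a batch of records along four of the five Vs."""
        volume = len(batch)                                  # Volume: how much data
        variety = Counter(r["type"] for r in batch)          # Variety: which kinds of data
        timestamps = [datetime.fromisoformat(r["ts"]) for r in batch]
        span = (max(timestamps) - min(timestamps)).total_seconds()
        velocity = volume / span if span else float("inf")   # Velocity: records per second
        valid = sum(1 for r in batch if r["value"] is not None)
        validity = valid / volume                            # Validity: share of usable records
        return {"volume": volume, "variety": dict(variety),
                "velocity_per_s": velocity, "validity": round(validity, 2)}

    print(profile(records))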

In the book “Big Data – Using smart Big Data analytics and metrics to make better decisions and improve performance” Bernard Marr writes that if Big Data ultimately did not result in an advantage then it would be useless. We could not agree more.


Don’t let big data turn us into Big Brothers

What if Big Brother had access to big data technology? Big data handled without care might easily turn us all unknowingly into Big Brothers or their collaborators.

Although some might see it differently, the society foreseen and described by George Orwell in the dystopian novel 1984 has not become reality. But what if Big Brother had access to big data technology? Big data handled without care might easily turn us all unknowingly into Big Brothers or their collaborators.

Big Brother is symbolic of the totalitarian state Oceania where every citizen was under constant surveillance by the authorities.

Digital computers had only just been invented when 1984 was published in 1949, and were hardly known to the general public. Computers and big data play no real role in the novel, but it is easy to imagine what Big Brother could have done with the ability to capture, curate, manage and process huge volumes of data.

Big Brother better off with computers

Without doubt, Big Brother would be far better off with big data capabilities. The regime had the manpower but not the necessary data tools.

Today we have those tools. We do not even need the manpower Big Brother possessed. Automatic capture and processing of tremendous amounts of data can be handled by computers, machine learning and Artificial Intelligence, assisted by a few people.

Read more: the Basefarm & *um big data definition.

What we call big data lakes are the key resource for big data analyses. A data lake can be fed from many sources, and the potential results of analyzing it can be very interesting.

Huge big data potential

By looking into big data, we can reveal new insights about customers and entire societies, including new ways of distributing services and even entirely new business models. Basefarm is capable of providing these kinds of analyses.

The view of some big data evangelists is that companies possessing big data capacities might be in a position to redefine their entire business. For instance, logistics companies produce enormous amounts of data. Evangelists suggest that this data represents so much value that exploiting it, rather than the original logistics business, could become such a company’s core in the future.

Will they also become Big Brothers?

Avoid the dark path

Unless companies are careful, that might very well be the outcome. The path to Big Brother status starts with what data you collect in the big data lake.

Security and compliance are an integrated part of daily Basefarm operations. The value of this knowledge is even higher in the new world of increasing capabilities to collect, curate, communicate and move data.

Without compliance work, we can all easily step over the threshold and become something far from our intentions.

The Ministry of Love wants your logs

An example of the road to becoming Big Brother is how we handle logs. Infrastructure and application logs are true big data sources. At Basefarm we are enthusiastic about the opportunities these sources provide. So much information is available to improve production and the customer experience, even leading to completely new ways of serving our customers.

However, logs contain personal data, and their geographical distribution, processing and storage are regulated by the GDPR. Not least, who may access the data is a big topic. If the logs contain information about personal health, perhaps only medical doctors or psychologists are allowed to access them. You definitely don’t want them falling into the hands of Big Brother.

The log data example is interesting. For all Basefarm knows, IT staff might already be unknowingly handling logs in a way that does not comply with regulations.
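
As a purely illustrative sketch of one compliant-handling step (not Basefarm’s actual tooling), the Python snippet below pseudonymizes two obvious kinds of personal data, IP addresses and e-mail addresses, before log lines are loaded into a data lake. The regular expressions, the salt handling and the log format are assumptions made for this example.

    import hashlib
    import re

    # Patterns for two common kinds of personal data found in logs.
    IP_RE = re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b")
    EMAIL_RE = re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b")

    def pseudonymize(value: str, salt: str = "rotate-this-salt") -> str:
        """Replace an identifier with a stable but non-reversible token."""
        return hashlib.sha256((salt + value).encode()).hexdigest()[:12]

    def scrub(line: str) -> str:
        """Pseudonymize IPs and e-mail addresses before the line enters the data lake."""
        line = IP_RE.sub(lambda m: "ip_" + pseudonymize(m.group()), line)
        line = EMAIL_RE.sub(lambda m: "user_" + pseudonymize(m.group()), line)
        return line

    print(scrub("2018-05-25 10:12:01 login ok user=alice@example.com from 192.168.1.23"))

Pseudonymization of this kind is only one building block; access control, retention and the other GDPR obligations still apply to the scrubbed data.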

Unknowing collaborator

No sane person would like to be associated with Big Brother. We do not want to contribute to others becoming Big Brother, and definitely not by letting them use our data.

To avoid this we need comprehensive control of our data. We need to control where it is, what it contains, who has access and how it is shared with governments, service partners and companies which provide big data services like Basefarm.

The key to avoiding collaborating with Big Brother is to handle your data correctly. It is a priceless asset for your business and you don’t want it falling into the wrong hands. Don’t risk becoming a Big Brother. Through compliance with GDPR and other regulations we can derive huge value from big data.

This might also be interesting

Download Data Thinking whitepaper

Data Thinking addresses the key subject areas of our time: data, algorithms, compute and mindset. It supports companies comprehensively in times of great complexity and guides them through their own digital development.
Learn how you can benefit from Data Thinking.

Watch recording from our GDPR Webinar

May 25 is coming soon: do you know all the responsibilities of data controllers and processors? Listen to our guide and learn who does what!

Data is stupid; using it is clever

The rise of big data opens up new possibilities. By investing in the future and exploring use cases together with customers in many industries, Basefarm creates market leaders.


Use cases for big data projects are everywhere. Take, for instance, predictive maintenance in the offshore industry (e.g. wind turbine maintenance) and the merits of the 360-degree customer view in the hospitality industry. But to flourish in this rapidly evolving world, it’s increasingly important to be agile and flexible. Many of Basefarm’s customers face the challenge of mixing and matching agile ways of working (such as DevOps environments) with traditional processes and infrastructures, resulting in a hybrid delivery model and a hybrid business.


“With this in mind, areas such as security, IoT and big data need extra focus,” says Stefan Månsby, Senior Director of Product Management & Big Data at Basefarm. “With our security division, we deliver 24/7 security services. And now we are also helping many of our customers to understand that very often, they have a golden opportunity to apply their domain expertise to their existing wealth of unexplored data.”


Data is the new oil

Businesses themselves are also becoming more data-driven. Companies are becoming more “hybrid” from a technical point of view, mixing and matching traditional and modern IT infrastructures. By making all their data available in one large pool, they embrace a new way of decision-making in which they rely on data science. Often, this opens up new possibilities for non-linear growth, leading to companies crossing the traditional boundaries between industries. A well-known example of this is Tesla. In their mission to accelerate the world’s transition to sustainable energy, they build solar panels, batteries and electric cars.

A comparison to Tesla doesn’t do much justice to most companies. But this doesn’t mean they shouldn’t embrace big data. The most typical big data use cases show up in manufacturing, service and maintenance. The potential benefits of predictive maintenance, for example, are huge. By collecting and analyzing data from machine parts, it becomes possible to predict failure and to schedule maintenance. One Basefarm customer performs maintenance on wind farms in the Baltic Sea. With only a few ships available that can hoist ball bearings into wind turbines, they save millions of euros every year by letting AI calculate the optimal shipping routes.
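
As a rough, hypothetical illustration of the idea behind predictive maintenance (not the model actually used for the wind farms), the Python sketch below flags a bearing for maintenance when its recent vibration readings drift well above their historical baseline; the readings and the threshold are invented for the example.

    import statistics

    # Hypothetical daily vibration readings (mm/s) from one ball bearing.
    history = [2.1, 2.0, 2.2, 2.1, 2.3, 2.2, 2.4, 2.6, 2.9, 3.4]

    def maintenance_due(readings, window=5, sigma=3.0):
        """Flag a part when recent vibration drifts well above its baseline."""
        baseline = readings[:-window]
        recent = readings[-window:]
        mean = statistics.mean(baseline)
        stdev = statistics.stdev(baseline)
        # A sustained excursion beyond mean + sigma * stdev suggests scheduling a visit.
        return statistics.mean(recent) > mean + sigma * stdev

    print("Schedule maintenance:", maintenance_due(history))

In practice such hand-written rules give way to models trained on labelled failure data, but the principle of acting before the failure stays the same.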

You have the data; now use it

There are numerous examples of big data use cases. Månsby: “At Basefarm, we organize workshops with our customers which generate hundreds of ideas and scenarios.”

The next step is often to design a Proof of Concept (PoC) to present to the company’s board.

“Basically, we can go from whiteboard to a working first PoC in 8 to 12 weeks,” Månsby says. “The size of the company doesn’t matter. Whether you are BMW or a small enterprise, it doesn’t make any difference. If your company has a top-heavy culture, for instance, and data science seems a bit too ‘Star Trekky’ for the CXOs, we sometimes give the CXO access to a small subset of data to play around with in a notebook. So they get a feel for the possibilities and start to understand that this technology isn’t black magic or an experimental lab product. It’s very real, it’s here now, and it helps you achieve big goals such as major improvements in efficiency, greater sustainability and new revenue streams.”

About Stefan Månsby

Stefan Månsby is Senior Director of Product Management & Big Data at Basefarm. He has broad experience in the IT industry and has driven change in many organizations over the years. His main passion is digital innovation, and he is also a great photographer and music producer.

Data thinking is the holy grail of organic growth

Where does success come from? Nowadays, data thinking is a key component. It’s the culture that is responsible for SpaceX’s pioneering Falcon Heavy rocket launch as well as the secret behind hotels and bars remembering your favorite drink.

If there is anything that drives the most successful businesses right now, it is the clever use of data. Seen in this light, the acquisition by Basefarm of the Berlin-based The Unbelievable Machine Company (*um), the leading service provider for big data, cloud and managed cloud services in Germany and Austria, comes at exactly the right moment.

“Many of our customers are huge data owners. Data is the asset of the future,” explains Stefan Månsby, Senior Director of Product Management & Big Data at Basefarm. “European companies need to catch up with their North American counterparts. The big boys in Silicon Valley, such as Amazon and Google, are leading the race and there is nothing wrong with that. But some parts of Europe lag almost a decade behind when it comes to big data maturity. This needs to change.”

Great data leads to great ideas

Amongst many other industries, airlines and leisure companies will benefit greatly from having a 360-degree view of the customer. By gaining insight into customer behavior and needs, they can turn the customer’s next flight or stay into a ”super-tailored experience” because they already know the customer’s exact preferences. Even a result as simple as having your favorite drink waiting for you when you arrive at a hotel can make a big difference. But how do you get there as a company? You have to concentrate on data first, by putting all your data in one place.

“The first thing we recommend is what we call ‘data thinking’,” says Månsby. “You provide the essential hard data so that a company can make the necessary decisions. Part of this is data science. You test hypotheses and either they make sense and get you the revenue, or they are a bad idea but you learn from it. By investing in such an agile culture, you can set yourself apart from your competitors and gain a market advantage. Focus on the idea of what you would like to do, not how you will technically solve it. The idea will make your business unique and a leader, not the technology.”

Elon Musk: solar panels, batteries, cars and rockets

A big difference between traditional business and business that relies on data thinking lies in the way they evolve. With the latter, this is far from linear. An example is a company that builds self-driving buses. Their core business is to make such vehicles but, once the buses are driving around in cities, the company can start a side business in traffic reports based on the data they have collected. The new revenue streams could potentially even make public transport free for passengers.

“Data thinking enables new opportunities,” Månsby says. “Look at Ikea. Data thinking has made it Sweden’s second largest food exporter. Another example is Tesla, whose mission is to accelerate the world’s transition to sustainable energy. Hence, they need to develop the ultimate battery and then apply it in great cars to prove their point. That’s amazing. As a data-thinking company, you have a big advantage over linear competitors.”

Do you want to know more?

Here you can find our Data Thinking webinar recordings about AI: https://www.basefarm.com/en/services/big-data

About the Author

Stefan Månsby is Senior Director of Product Management & Big Data at Basefarm. He has broad experience in the IT industry and has driven change in many organizations over the years. His main passion is digital innovation, and he is also a great photographer and music producer.

Where does Big Data begin? – Many perspectives, one classification

Big Data is a buzz phrase that is used in various situations and is constantly developing.

To classify Big Data decisively is not so easy. Firstly, it is not just a stand-alone term but rather a combination of many aspects to reveal a whole picture. And secondly, Big Data is a buzz phrase that is used in various situations and is constantly developing. It is time to set things straight.

Buzz phrase? Collective term? Synonym?

All of the above. Fundamentally, Big Data represents large digital data volumes as well as the capturing, analyzing and evaluating of it. Therefore, Big Data is also the collective term for all digital technologies, architectures, methods and processes that are required for these tasks. Or as Hasso Plattner says: “Big Data is a synonym for large data volumes in a wide range of application areas as well as for the associated challenge of being able to process them.”

Large data volumes?

Very large. “By the year 2003, humans had created a total of 5 trillion gigabytes of data. In 2011 the same amount was created within 48 hours. Now, creating the same data volume requires just 7 minutes,” as RBB Radioeins illustrated in simple and effective terms. Driven by the internet, social networks, mobile devices and the Internet of Things, worldwide digital data volumes will grow another tenfold by 2020. In Germany alone the current figure of 230 billion GB will rise to 1.1 trillion GB.
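
Taking the quoted figures at face value, a quick back-of-the-envelope Python calculation shows how sharply the implied generation rate has risen:

    # The same ~5 trillion GB created in ever shorter windows (figures as quoted above).
    volume_gb = 5e12                      # 5 trillion gigabytes

    seconds_2011 = 48 * 3600              # 48 hours
    seconds_now = 7 * 60                  # 7 minutes

    rate_2011 = volume_gb / seconds_2011  # GB per second in 2011
    rate_now = volume_gb / seconds_now    # GB per second today

    print(f"2011: {rate_2011:,.0f} GB/s; now: {rate_now:,.0f} GB/s "
          f"(roughly {rate_now / rate_2011:.0f}x faster)")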

This is exactly where Big Data comes into play: the huge data volumes are checked for relationships using algorithms, and the whole process requires a combination of several disciplines. “It ranges from traditional informatics and data science to interface design, from machine learning, deep learning and artificial intelligence to mathematics, statistics and data interfaces,” explains Florian Dohmann, Senior Data Scientist at The unbelievable Machine Company. “A lot of this is nothing new, but combining them all creates the basis for new opportunities.”
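
As a toy illustration of what “checking data for relationships” can mean in code (this is not *um’s actual methodology), the Python sketch below computes pairwise correlations across a few invented columns; a real project applies the same idea to far larger and messier data.

    from itertools import combinations
    from statistics import correlation  # Python 3.10+

    # Invented columns; in practice these would come from a data lake, not a dict.
    data = {
        "temperature": [18, 21, 24, 27, 30, 33],
        "ice_cream_sales": [110, 135, 160, 190, 220, 240],
        "umbrella_sales": [80, 70, 75, 60, 55, 50],
    }

    # Check every pair of columns for a linear relationship.
    for a, b in combinations(data, 2):
        print(f"{a} vs {b}: r = {correlation(data[a], data[b]):+.2f}")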

So it is only about data volumes?

Fundamentally, yes. Big Data is firstly defined by data volumes that are “too large, too complex, change too quickly or are structured too weakly to be analyzed with manual and traditional data processing methods,” according to Wikipedia. But to define where Big Data begins – i.e. from which point the targeted use of data becomes a Big Data project – you need to take a close look at the details.

Big Data Intro-Webinar!

Watch webinar on demand! Big Data inspiration with Big Data Chief Evangelist, Klaas Bollhöfer!

Big Data has become a buzzword over the last years. It is not just a stand-alone term but rather a combination of many aspects to reveal a whole picture.

You might ask why Basefarm in particular is hosting a webinar about Big Data.
We have been a managed service provider of mission-critical solutions for years, and are now expanding our business through our acquisition of the German company “The Unbelievable Machine Company”.

Our Big Data expertise is relevant and interesting for many industries, from an operational, development and “ideation” perspective.
We have reference cases such as Deutsche Post, Gebr. Heinemann, Audi, Deutsche Welle, Delivery Hero, Metro Group and Parship. Read more about *UM here!

In this session you will get an inspiring introductory webinar in which Chief Evangelist Klaas Bollhöfer explores the possibilities of Big Data.
The webinar is for everyone, and you do not need any prior knowledge of the topic. The session will be in English.

At the end of this session you will have a fundamental understanding of what Big Data is, the challenges that come with it, why you should start looking into it in 2018 and, last but not least, how you can turn your data into business opportunities.

Big Data Chief Evangelist – Klaas Bollhöfer

Klaas Bollhöfer has acted as the Chief evangelist of The unbelievable Machine Company, a Basefarm company, for more than 5 years now, and is pioneering data science in Germany, Europe and beyond. At the interface of business, IT, artificial intelligence and design he develops cutting-edge strategies, spaces, services, teams and sometimes escape routes, and describes himself as a for-, side- and backward thinker. Besides that he is founder and managing director of Birds on Mars, a Berlin-based consultancy exploring and developing the intersections of human and artificial intelligence. The time left is filled with lightning talks, guest lectures, program committee chairs and craft brewing. Klaas is a certified Scrum master, design thinker, mediator and coach and will never stop being curious.

Big data Olympics

Four gold medals and one silver medal during the 2018 Winter Olympics are proof that Jac Orie is a successful speed skating coach. Why? It all has to do with data!

In the ice skating world, the name of Jac Orie is well established. He is the man behind the biggest successes of many Dutch speed skaters. Gerard van Velde in 2002, Marianne Timmer in 2006, Mark Tuitert in 2010 and Stefan Groothuis in 2014: they all won Olympic gold working with Orie. Apart from a mountain of medals, these skaters have left something valuable behind: a huge amount of data. Advanced analytics on almost two decades’ worth of data has helped Orie to train his team even more smartly in the run-up to the 2018 Winter Olympic Games in Pyeongchang, South Korea.

Data science

The results of Orie’s big data project have been astounding so far. Millions of viewers all over the world saw Sven Kramer (men’s 5,000 metres), Carlijn Achtereekte (women’s 3,000 metres) and Kjeld Nuis (men’s 1,000 and 1,500 metres) skating to gold. And Patrick Roest (men’s 1,500 metres) won silver. Less visible is what exactly lies behind these successes. For many years, Orie has been using test data generated by skaters to calculate speed and stamina. For Pyeongchang, however, he went one step further and collaborated with Leiden-based data scientist Arno Knobbe.

The big data approach, whereby computing power is used to perform calculations on big volumes of data, has led to many useful insights. These include the relation between the type of training and the moment, duration and intensity of the training. A skater who has profited hugely from this is Kjeld Nuis. Data showed that stamina training in the morning proved ineffective for him, leading to an improvement in his training programme – and two gold medals in Pyeongchang.
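
The kind of question behind that finding can be illustrated with a small, entirely invented Python example (this is not Orie’s or Knobbe’s data or method): compare how race times changed after morning sessions versus afternoon sessions.

    from statistics import mean

    # Invented records: training slot and the change in race time (seconds) that followed.
    sessions = [
        {"slot": "morning", "delta_s": +0.12},
        {"slot": "morning", "delta_s": +0.05},
        {"slot": "morning", "delta_s": +0.09},
        {"slot": "afternoon", "delta_s": -0.20},
        {"slot": "afternoon", "delta_s": -0.15},
        {"slot": "afternoon", "delta_s": -0.11},
    ]

    # Average effect per slot: a negative change means the skater got faster afterwards.
    for slot in ("morning", "afternoon"):
        deltas = [s["delta_s"] for s in sessions if s["slot"] == slot]
        print(f"{slot}: mean change {mean(deltas):+.3f} s over {len(deltas)} sessions")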

Supercompensation

For Orie, Knobbe and the skating sport in general, the big data journey is just beginning. For example, the phenomenon of ‘supercompensation’ still needs to be figured out. Supercompensation is what happens when an athlete temporarily lowers the training intensity, leading to recovery of the body and an increase in racing performance. Obviously, this effect needs to be timed perfectly in the run-up to an important race. It’s a complex equation, with the results of training sessions sometimes showing up months later and with training types having different effects on performance for sprinting distances (especially the 500 and 1,000 metres), on the one hand, and longer distances (1,500 metres and above), on the other.
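
One simple way to look for such delayed effects, sketched here with invented weekly numbers rather than the researchers’ actual model, is to lag the training load and see which lag correlates best with later performance:

    from statistics import correlation  # Python 3.10+

    # Invented weekly series: training load and race performance (higher is better).
    load = [5, 7, 6, 8, 4, 3, 6, 7, 5, 4, 6, 8]
    performance = [60, 58, 59, 57, 61, 64, 60, 58, 61, 63, 60, 57]

    # Correlate performance with training load shifted back by 0 to 3 weeks:
    # a strong correlation at lag k hints that training pays off k weeks later.
    for lag in range(4):
        n = len(load) - lag
        print(f"lag {lag} weeks: r = {correlation(load[:n], performance[lag:]):+.2f}")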

Golden opportunities – everywhere

It is certainly not an exaggeration to say that the 2018 Winter Olympics have become the first big data Olympics. As a best practice, the example set by the Dutch skaters will be followed by other athletes looking to optimize their performance. And it’s not just in sporting events that data thinking is making such an impact. Many companies are becoming more data-driven.

At Basefarm, we work together with some of these companies to explore their existing wealth of unexplored data and find new use cases. In the manufacturing, service and maintenance industries, for instance, the use of predictive maintenance saves companies millions of euros every year. And this is only just the beginning. Undoubtedly, big data will shape the next Olympic games as well as the business world of tomorrow. Our question to you: will you be a contender for gold?

About Ronald Tensen

Ronald Tensen is Marketing Manager at Basefarm in the Netherlands. He has broad experience in the internet and IT industry (B2B and B2C), is successful at developing and launching new consumer services and brands, has a strong customer focus and, of course, is a great team player!