BF-SIRT Newsletter 2018-12

Bitcoin's blockchain poisoned

Researchers from RWTH Aachen University and Goethe University in Germany have uncovered images of, and links to, child pornography in the Bitcoin blockchain. The analysis found that content of this kind could render the mere possession of the blockchain illegal, since the data is distributed to all Bitcoin participants.

Version 7 of CIS Controls released

“CIS Controls Version 7” was released Monday by the Center for Internet Security, including steps for mapping the well-known “high-priority short list” of defensive actions to the National Institute of Standards and Technology’s framework of cybersecurity standards.


Top 5 Security links
Pirate Websites Expose Users to More Malware, Study Finds
AMD Will Release the Patches for the Recently Discovered Flaws Very Soon
Dragonfly Compromises Core Router to Attack Critical Infrastructure
Firefox Master Password System Has Been Poorly Secured for the Past 9 Years
EXCLUSIVE: ‘Lone DNC Hacker’ Guccifer 2.0 Slipped Up and Revealed He Was a Russian Intelligence Officer


(Blogpost image by Stefan Krause, “Glühlampe explodiert“, Free Art License)

Malware is so 2017: five new security trends to watch out for

Outbreaks such as Petya and WannaCry really put the malware threat on the IT agenda and made cybersecurity a priority for everyone. Fredrik Svantes, Senior Information Security Manager at Basefarm, explains the latest developments that keep the cybersecurity community busy.

BF-SIRT Newsletter 2018-11

AMD Vulnerabilities

This week, CTS-Labs sent out an advisory regarding AMD Vulnerabilities.
What's worth noting is that all of the vulnerabilities require local administrator access to exploit, and if an attacker already has that access it is basically game over in any case. There are also concerns that the disclosure was timed to manipulate stock prices, and the fact that CTS-Labs gave AMD only one day's notice before going public (instead of the customary 30–90 days) has set off red flags for some parties.


Top 5 Security links
Let’s have a sober look at these ‘ere annoying AMD chip security flaws
APT Hackers Infect Routers to Covertly Implant Slingshot Spying Malware
ISPs Caught Injecting Cryptocurrency Miners and Spyware In Some Countries
Pre-Installed Malware Found On 5 Million Popular Android Phones
Update Samba Servers Immediately to Patch Password Reset and DoS Vulnerabilities

What is Big Data? – A definition with five Vs

To define where Big Data begins and the point at which the targeted use of data becomes a Big Data project, you need to take a look at the details and key features of Big Data. Its definition is most commonly based on the 3-V model from the analysts at Gartner and, while this model is certainly important and correct, it is now time to add two more crucial factors.

Big Data definition – the three fundamental Vs:

  • Volume refers to the huge amount of data produced each day, by companies for example. Data is now generated in such large and complex quantities that it can no longer be stored or analyzed using conventional data processing methods.
  • Variety refers to the diversity of data types and data sources. An estimated 80 percent of the world's data today is unstructured and at first glance shows no indication of relationships. Thanks to Big Data algorithms, such data can be sorted in a structured manner and examined for relationships. Data does not only comprise conventional datasets, but also images, videos and speech recordings.
  • Velocity refers to the speed with which the data is generated, analyzed and reprocessed. Today this is mostly possible within a fraction of a second, known as real time.
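As a toy illustration (our own hypothetical event stream, not a real dataset), the three Vs above can be read off a batch of records programmatically:

```python
from datetime import datetime

# Hypothetical event stream mixing structured and unstructured records.
events = [
    {"type": "sensor", "payload": {"temp": 21.4},      "ts": datetime(2018, 3, 1, 12, 0, 0)},
    {"type": "image",  "payload": b"\x89PNG...",        "ts": datetime(2018, 3, 1, 12, 0, 1)},
    {"type": "text",   "payload": "free-form log line", "ts": datetime(2018, 3, 1, 12, 0, 2)},
]

volume = len(events)                         # Volume: record count (stand-in for raw size)
variety = len({e["type"] for e in events})   # Variety: number of distinct data types
span = (events[-1]["ts"] - events[0]["ts"]).total_seconds()
velocity = volume / span                     # Velocity: records generated per second
```

At real Big Data scale the same three measurements apply, just taken over billions of records and streaming windows rather than a three-element list.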

Big Data definition – two crucial, additional Vs:

  • Validity is the guarantee of data quality; alternatively, Veracity is the authenticity and credibility of the data. Big Data involves working with all degrees of quality, since high Volume usually comes at the expense of quality.
  • Value denotes the added value for companies. Many companies have recently established their own data platforms, filled their data pools and invested a lot of money in infrastructure. It is now a question of generating business value from their investments.

As we wrote in our previous blog post, defining Big Data is not easy, since the term relates to many aspects and disciplines. For many people the most important thing is business success (Value). The key to that success is gaining new information, which must be available to many users very quickly (Velocity), from huge amounts of data (Volume) drawn from highly diverse sources (Variety) and of differing quality (Validity), so that important decisions can be made quickly enough to gain or maintain a competitive advantage.

In his book "Big Data: Using Smart Big Data, Analytics and Metrics to Make Better Decisions and Improve Performance", Bernard Marr writes that if Big Data ultimately does not result in an advantage, it is useless. We could not agree more.


BF-SIRT Newsletter 2018-10

Netflix could pwn 2020s IT security – they need only reach out and take

The container is doomed, killed by serverless. Containers are killing Virtual Machines (VM). Nobody uses bare metal servers. Oh, and tape is dead. These, and other clichés, are available for a limited time, printed on a coffee mug of your choice alongside a complimentary moon-on-a-stick for $24.99. Snark aside, what does the future of containers really look like?

  • No one company is going to dominate IT security in the 2020s, but there is an empire to be built on building the very best workload wrapper money can buy.
  • VMware has all the components to build this puzzle piece. Unfortunately, they're trapped in whatever hell befell Microsoft in 2005.
  • Red Hat has most of the required components, but it will probably take them at least a decade to integrate all of it into systemd.
  • Nobody is going to build an empire on containers, because containers are only one part of a more important puzzle piece.
  • Netflix gave the world the Chaos Monkey, and then decided to build a full-scale Simian Army.
  • Which vendor(s) will pull it together and dominate that niche?


Top 5 Security links
https://nakedsecurity.sophos.com/2018/03/08/smart-traffic-lights-cause-jams-when-fed-spoofed-data/
https://arstechnica.com/information-technology/2018/03/it-just-got-much-easier-to-wage-record-breaking-ddoses/
https://devco.re/blog/2018/03/06/exim-off-by-one-RCE-exploiting-CVE-2018-6789-en/
https://threatpost.com/pos-malware-found-at-160-applebees-restaurant-locations/130281/
https://www.theregister.co.uk/2018/03/08/dutch_police_detail_how_they_became_the_admins_for_hansa_dark_web_market/

A comparison of public cloud providers: AWS, Azure, and Google. Part 2

Public cloud and Infrastructure-as-a-Service (IaaS) solutions are key to digital business success nowadays. On an enterprise level, Amazon Web Services (AWS), Microsoft Azure, and the Google Cloud Platform are the industry-leading solutions.
In the first part of this post, we focused on AWS. Now, we’re moving on to Azure and Google. We’ll explain the differences between the solutions and look at which solution is the best fit for which type of company.

Microsoft Azure

Microsoft Azure is the second largest public cloud Infrastructure-as-a-Service (IaaS) platform according to Gartner. It was launched in 2010 as a cloud-based development platform and Platform-as-a-Service (PaaS) offering, but entered the IaaS market a few years later. Microsoft has also transformed itself from an organization selling perpetual licenses and releasing software yearly into a cloud provider launching innovative new services and features monthly. Each year it closes in on the head start that AWS got, as can be seen in Gartner's Magic Quadrants.

Microsoft has chosen a different strategy from AWS when it comes to building out its global presence, making enormous investments in hyperscale data centers in a larger number of locations: Azure is available today in 36 regions, with 6 more announced. In selected regions Microsoft is now also investing in Availability Zones, currently available as a preview service.

Azure is often chosen as the strategic cloud solution by customers who focus on Microsoft technologies or whose cloud strategy fits Microsoft's approach, spanning IaaS, PaaS, SaaS and a hybrid strategy including Azure Stack. However, since Microsoft's strategic shift toward the open-source community in 2014, Linux is now a first-class citizen in Azure, and several PaaS services use open source (e.g. Hadoop, Kubernetes, Spark).

Enterprises are still the primary focus for Microsoft, even though it is also reaching out to startups. This can be seen in its large local customer teams in every country and its focus on compliance, certifications, security and the hybrid strategy. With this push in the market, we are also starting to see large enterprises moving their mission-critical systems to Azure or building hybrid cloud strategies around it.

The speed of innovation in the cloud comes at a cost, which has shown itself in slow support, a lack of documentation and training, and a slow-growing partner ecosystem. These issues have been actively addressed, and improvements are being made continuously.

Google Cloud Platform

In 2011, Google joined the swathes of providers venturing into the cloud market with its PaaS solutions. Since then, it has also added IaaS services to its portfolio. The Google Cloud Platform now offers all the core functions required for enterprise workloads, and Gartner even rates the company as a leader in the fields of application containers, big data management, and machine learning. Google’s open-source, platform-independent machine learning platform is a unique feature of its portfolio.

Google has added SaaS solutions to its IaaS and PaaS portfolio, including productivity tools that incorporate traditional Office components. Google does not offer any discounts for these tools via enterprise agreements, but instead employs its own program in which customers pay less for each software unit the more they use the cloud platform.

The company's regional presence is often cited as a limiting factor, but Google is currently expanding its international, decentralized data center capacity. Other enterprise functions, such as role-based access controls and user management tools, are, according to Gartner, also ripe for expansion. Google is working on a solution and is expected to have gained ground on AWS and Azure by the time the next Gartner analysis is conducted.

Which cloud provider for which type of company?

The analyses and comparisons highlight differences, individual strengths, and potential areas for expansion. With all these factors to consider, companies seldom choose to put all their eggs in one basket by partnering with a single cloud provider. According to Gartner’s findings, most opt for a hybrid or multi-cloud strategy. Depending on the application area, companies use services from at least two, and sometimes even all three leading public cloud providers.

With this in mind, we recommend that companies develop their own assessment criteria based on their own application scenarios – then use their assessments to decide which public cloud provider is the best fit for their organization. One of the challenges is the many technology choices one has to make when implementing a service or creating an application. Unclear guidance on when to use what makes it harder and more complex to pick the right implementation.

If a hybrid or multi-cloud solution is on the table, it is a good idea to obtain external expert advice. At least for the time being, it is unlikely that a single provider will be able to satisfy all the requirements of a company.

This may interest you, too:
A comparison of public cloud providers: AWS, Azure, and Google (Part 1)

A comparison of public cloud providers: AWS, Azure, and Google. Part 1

If you’re looking for enterprise-level public cloud and Infrastructure-as-a-Service (IaaS) providers, Amazon Web Services (AWS) and Microsoft Azure are likely to be among the first names to spring to mind. In Gartner’s view, Google is hot on their heels – and these top-three major players are well ahead of others in the field. Here, we take a closer look at the differences between the providers – starting with AWS.

For its latest analysis “Magic Quadrant for Cloud IaaS”, American market research firm Gartner rated the performance of the leading public cloud platforms against 234 criteria. AWS came out on top, achieving 92 per cent of the defined requirements for IaaS services, closely followed by Microsoft with 88 per cent. Google Cloud Platform came in third with a score of 70 per cent, although the internet giant is edging ever-closer to a leader score.

The three main players are all expanding their portfolios, which makes it even more difficult for companies to choose between them. Our comparison might help you decide which cloud provider is in the best position to meet your needs.

Amazon Web Services

Of all the providers, AWS boasts the most mature cloud offering. As a result, it is often the first-choice partner for the majority of applications. One of the main reasons for this popularity is that, alongside its public cloud services in the Infrastructure-as-a-Service (IaaS) sector, AWS also offers a wide range of tools for customers who wish to use Platform-as-a-Service (PaaS) technology to develop, test, and launch their own applications, including DevOps tools and tools for the development of mobile services. AWS also offers Hadoop cluster, data lake, and database options.

Another of the provider’s strengths is its global cloud infrastructure. AWS currently has more than 44 availability zones in 16 regions. Each of these availability zones boasts one or more dedicated data centers; each of these data centers meets the highest security standards and achieves unprecedented levels of reliability. Last year, AWS had more computing power in its cloud than all its competitors put together.

However, the factor that really sets AWS apart from other providers is the scope and depth of its platform. AWS is continually adding new services, technologies, and functions to its offering. At the AWS Re:Invent developer conference the company recently unveiled an impressive new series of tools dedicated specifically to machine learning technology, which is set to burst onto the scene in the very near future.

These tools included the world’s first deep-learning and fully programmable video camera, and a technology that tracks people in videos and can detect and categorize their actions. The company also revealed an analysis tool capable of recognizing 100 different languages and various linguistic units (places, names, people, and more), as well as a fully managed end-to-end service for scalable machine learning models.

To make efficient and effective use of these opportunities – for example, to adapt your own applications for the cloud or to link AWS cloud services to your own IT environment – it is a smart idea to obtain professional advice. With expert support, medium-sized companies can get the guidance they need for a smooth and easy transition to the cloud.

We’ll discuss the differences between the AWS public cloud and IaaS services from Microsoft Azure and the Google Cloud Platform – plus the strengths of each solution and which options are best suited to which types of company – in the second part of this post.

Would you like to know more?

Please contact Jan Aril Sigvartsen, our Cloud Transformation Expert, or fill in the form.

Where are you on the cloud journey compared to your competitors? Find out here in our Next step cloud guide!

BF-SIRT Newsletter 2018-09

Memcrashed – Major amplification attacks from UDP port 11211

Over the last couple of days we've seen a big increase in an obscure amplification attack vector: the memcached protocol, with traffic coming from UDP port 11211.

The general idea behind all amplification attacks is the same. An IP-spoofing capable attacker sends forged requests to a vulnerable UDP server. The UDP server, not knowing the request is forged, politely prepares the response. The problem happens when thousands of responses are delivered to an unsuspecting target host, overwhelming its resources – most typically the network itself.
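The forged-request pattern above is attractive because a memcached UDP request is just an 8-byte frame header followed by an ASCII command, so a tiny request can elicit a response many thousands of times larger. A rough sketch (the response size below is an assumed figure for illustration, not a measurement):

```python
import struct

def memcached_udp_request(payload: bytes, request_id: int = 0) -> bytes:
    """Build a memcached UDP frame: request id, sequence number,
    total datagrams and a reserved field (16-bit big-endian each),
    followed by the ASCII command."""
    header = struct.pack("!HHHH", request_id, 0, 1, 0)
    return header + payload

probe = memcached_udp_request(b"stats\r\n")          # 15 bytes on the wire
assumed_response_size = 750_000                      # hypothetical large cached value
amplification = assumed_response_size / len(probe)   # 50,000x in this scenario
```

With the source IP spoofed to the victim's address, the server delivers that oversized response to the victim instead of the attacker, which is the entire amplification trick.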

  • The discovery of a new amplification vector allowing very large amplification factors happens rarely. This new memcached UDP DDoS vector is definitely in that category.
  • In total we've seen only 5,729 unique source IPs of memcached servers. We're expecting to see much larger attacks in the future, as Shodan reports 88,000 open memcached servers.
  • In the GitHub DDoS incident on 28 Feb 2018, GitHub received at peak 1.35 Tbps via 126.9 million packets per second.
  • Please ensure that your memcached servers are firewalled from the internet!
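As a quick self-check (a sketch of our own, not a standard tool; only probe hosts you are responsible for, and from outside your network perimeter), you can test whether a memcached instance answers unauthenticated UDP requests:

```python
import socket
import struct

def memcached_exposed(host: str, timeout: float = 2.0) -> bool:
    """Send a harmless 'stats' probe to UDP port 11211.

    Any reply means the server answers unauthenticated UDP requests
    from this vantage point and could be abused as an amplifier."""
    # 8-byte memcached UDP frame header + ASCII command.
    probe = struct.pack("!HHHH", 0, 0, 1, 0) + b"stats\r\n"
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    try:
        sock.sendto(probe, (host, 11211))
        sock.recv(4096)
        return True           # got an answer: reachable over UDP
    except OSError:
        return False          # timeout or unreachable: no UDP exposure seen
    finally:
        sock.close()
```

If the check returns True for an internet-facing address, firewall port 11211 or start memcached with UDP disabled (`-U 0`).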

Top 5 Security links
https://cybersins.com/howto-resposible-disclosure-with-security-txt/
https://www.bleepingcomputer.com/news/security/23-000-users-lose-ssl-certificates-in-trustico-digicert-spat/
https://www.theregister.co.uk/2018/03/01/us_researchers_apply_spectrestyle_tricks_to_break_intels_sgx/
https://nakedsecurity.sophos.com/2018/02/28/single-sign-on-authentication-the-bug-that-let-you-logon-as-someone-else/
https://threatpost.com/bug-in-hp-remote-management-tool-leaves-servers-open-to-attack/130189