
June 2020

Amazon Web Services Mitigated a 2.3 Tbps DDoS Attack

Amazon Web Services (AWS) said that it mitigated a distributed denial-of-service (DDoS) attack that peaked at 2.3 Tbps. In its “Threat Landscape Report – Q1 2020,” the AWS Shield team revealed that it had spent several days responding to this network volumetric attack, which used CLDAP reflection, a known UDP reflection vector, at a previously unseen volume of 2.3 Tbps.
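CLDAP reflection works by sending small, source-spoofed UDP queries to open Connectionless LDAP servers (UDP port 389), whose much larger responses then flood the victim. As a rough illustration, and not AWS Shield’s actual method, the Python sketch below flags inbound UDP traffic sourced from port 389, the telltale signature of reflected CLDAP responses arriving at a victim; the use of the scapy library and the printed message format are assumptions for the example.

```python
# A minimal detection sketch: flag inbound UDP datagrams whose source
# port is 389 (CLDAP), the signature of CLDAP reflection traffic.
# Requires scapy and root privileges to capture packets.
from scapy.all import sniff, IP, UDP

def flag_cldap(pkt):
    # Reflected CLDAP responses arrive from UDP source port 389.
    if pkt.haslayer(IP) and pkt.haslayer(UDP) and pkt[UDP].sport == 389:
        print(f"possible CLDAP reflection: {pkt[IP].src} -> {pkt[IP].dst}, "
              f"{len(pkt)} bytes")

# The BPF filter keeps the capture cheap; store=False avoids buffering.
sniff(filter="udp and src port 389", prn=flag_cldap, store=False)
```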

Are airports and airlines prepared for cyber threats post COVID-19?

The COVID-19 pandemic has exposed numerous vulnerabilities and shortcomings in the airline industry. Aviation has been hit harder than most sectors: airports essentially served as the portals through which the virus traveled from one country to another across the globe, and the severe travel restrictions implemented by nearly every country have forced airline companies into a dire financial situation.

Why NHS, UK Healthcare Orgs Need to Boost Their Security in Age of COVID-19

All National Health Service (NHS) and social care organisations in the United Kingdom have always been, and will always be, targets for bad actors. The nature of their business and the sensitive data they hold make these entities appealing to attackers, who know that the legacy and irregularly patched systems common in healthcare organisations are easy to penetrate.

What Is the Cyber Kill Chain and How to Use It Effectively

You're probably familiar with the defense-in-depth or castle-and-moat approach to cybersecurity. It remains a common model that organizations use to think through their information security. As organizations have matured, however, they have sought out new models that help them better understand how cyber attackers operate and how best to defend against them.
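In Lockheed Martin's formulation, the Cyber Kill Chain breaks an intrusion into seven sequential stages, and the defender's goal is to break the chain as early as possible. The sketch below is a minimal illustration of that idea: the stage names follow the published model, while the mapped controls are hypothetical examples rather than a prescribed defense.

```python
# A minimal sketch of the seven Cyber Kill Chain stages mapped to
# illustrative (hypothetical) defensive controls.
KILL_CHAIN = [
    ("Reconnaissance",        "monitor for scanning and OSINT harvesting"),
    ("Weaponization",         "threat intelligence on exploit tooling"),
    ("Delivery",              "email filtering, web proxy blocking"),
    ("Exploitation",          "patching, endpoint protection"),
    ("Installation",          "application allow-listing, EDR"),
    ("Command and Control",   "egress filtering, DNS monitoring"),
    ("Actions on Objectives", "data-loss prevention, segmentation"),
]

def control_for(observed_stage: str) -> str:
    """Return the example control for the stage where an intrusion was
    observed; disrupting any earlier stage stops the attack outright."""
    for stage, control in KILL_CHAIN:
        if stage == observed_stage:
            return control
    raise ValueError(f"unknown stage: {observed_stage}")

print(control_for("Delivery"))  # email filtering, web proxy blocking
```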

Uncovering Bots in eCommerce Part 3: What Sets Scraper Bots Apart?

Web scraping uses bots to collect large amounts of data from websites: quite simply, to extract publicly available content and data, which a scraper bot can then duplicate in its entirety elsewhere. Scraper bots are not always bad. Bots are constantly at work behind the scenes making our digital lives run smoothly, and they are usually looking for information that you are already freely giving to your website's visitors.
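As a minimal illustration of what a scraper bot actually does, the Python sketch below fetches a page and extracts its visible text; the target URL, the User-Agent string, and the choice of the requests and BeautifulSoup libraries are assumptions for the example.

```python
# A minimal scraper-bot sketch: fetch a page and pull out its visible
# text, the "publicly available data" a scraper then duplicates elsewhere.
import requests
from bs4 import BeautifulSoup

URL = "https://example.com/products"  # hypothetical target page

resp = requests.get(URL, headers={"User-Agent": "demo-scraper/1.0"},
                    timeout=10)
resp.raise_for_status()

soup = BeautifulSoup(resp.text, "html.parser")
# Collect headings and paragraph text, as a content-duplicating bot would.
for tag in soup.find_all(["h1", "h2", "p"]):
    print(tag.get_text(strip=True))
```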