
Cybersecurity: A paradigm shift in the last decade


In 2010, the world was a different place: the global population had not yet reached 7 billion (it is now around 7.7 billion), Lance Armstrong's reputation was still intact, and Game of Thrones hadn't even debuted on HBO. We saw the world differently back then because our lives were viewed through different lenses. How have those lenses changed a decade later? What has changed in cybersecurity? Here are a few of the shifts that have had a significant impact on how we surf the web, and on how we share (or don't share) our data.

  • Personal information has become ubiquitous as a result of social media. In 2010, Facebook was expanding and Instagram was just getting started. Since then, social media has grown from occupying a small portion of our time to filling nearly all of our spare time. Reports in 2010 measured the average user's time on social media in hours per month; now we measure it in hours per day (2 hours and 16 minutes on average, to be exact). This widespread use has produced more than just likes and follows; it has made user information the most precious commodity on the market. It has also reshaped the way attackers operate: whereas they once went after network infrastructure and devised ways around firewalls, they have realised that people themselves are far easier targets. Because personally identifiable information (PII) is freely available on the web, data skimming leads to increasingly successful phishing campaigns. Some people take measures to protect their data and others do not, but nearly everyone has accepted that if you want to participate in society these days, you will almost certainly be on social media, and so your personal information will be publicly available.
  • The Snowden Effect altered our perception of data. The Guardian revealed in 2013 that the NSA was conducting surveillance on millions of Americans. Edward Snowden was subsequently identified as the story's whistle-blower, and whether you regard him as a role model or an antagonist, the impact of his actions on American culture was profound. Many people realised, for the first time, how easily personal data can be shared and spread without their consent, and how valuable even 'metadata' can be.
  • Ransomware went global in 2017. The WannaCry ransomware infected over 200,000 computers across 150 countries, holding machines hostage and demanding Bitcoin payments to return them to their owners. Although it was not the first ransomware, it was the biggest and most wide-ranging ransomware attack seen up to that point, and it forced people to confront the reality of the threats they were facing. Data breaches were no longer restricted to large corporations; this was an invasion that also reached personal computers. WannaCry ushered in a new age of data risk, forcing users to sit up and take notice more than ever before.
  • Multi-factor authentication has become extremely prevalent. Technically, multi-factor authentication (MFA) has existed for centuries: it simply means requiring multiple verified proofs of identity before granting access to something. MFA was used on the web before 2010, but it was only in the last decade or so, after numerous massive data breaches, that professionals began calling for its widespread use. If you log in from a new device, Google now requires a second verification step (2FA), and many other websites are heading the same way. This is partly because of the sheer amount of information we keep online, but it is also due to the number of high-profile cyberattacks and breaches that have put the threat into perspective for many account holders. MFA makes data far harder to compromise, and it is now all but mandatory for anyone serious about keeping their information confidential (a minimal sketch of how a common second factor is generated appears after this list).
  • The internet raised a generation. Today's teenagers were all born after 2000, with the newest cohort practically coming of age with smartphones in their hands. They have never known a world without the web; it is as much a part of daily life as bikes, TV, and cars, if not more so. Many people assume that this puts the younger generation at significant risk: having grown up in the digital age, the thinking goes, they are far too trusting, and because their data has always been online they may never have considered how to safeguard it. Recent studies, however, show that today's teens are far more aware of their internet privacy than previous generations; by comparison, only 32.5 percent of people around 65 years of age have ever reviewed or adjusted their privacy settings. Yes, these teenagers have grown up in the digital age, but that is precisely why they are more conscious of the risks. If kids are the future, the coming years may be safer and more protected than we thought.
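
To make the idea of a second factor concrete, here is a minimal sketch of how a time-based one-time password (TOTP), the rotating six-digit code an authenticator app shows, can be computed following RFC 6238 using Python's standard library. It is illustrative only: the base32 secret below is a made-up placeholder, not a real credential.

import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    """Return the current time-based one-time password (RFC 6238, HMAC-SHA1)."""
    key = base64.b32decode(secret_b32, casefold=True)
    # Count 30-second steps since the Unix epoch and pack them as a big-endian counter.
    counter = struct.pack(">Q", int(time.time()) // interval)
    digest = hmac.new(key, counter, hashlib.sha1).digest()
    # Dynamic truncation: the low 4 bits of the last byte select a 4-byte window.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

if __name__ == "__main__":
    # Placeholder shared secret for demonstration; a real service issues its own.
    print(totp("JBSWY3DPEHPK3PXP"))

The verifying website only needs the same shared secret and a reasonably accurate clock; even if a password leaks, an attacker without that secret cannot produce the current six digits.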

With cybersecurity, the last decade has seen a sea change. More information is available on the internet, more data is vulnerable, and more data is safeguarded by evolving security measures. We're learning on the job, and in the information age we're learning how to secure sensitive information that so many actors (both good and evil) would like to reach. Will we look back in 2030 and see more of the same? New regulations? New kinds of vulnerabilities, and new social networking sites? Probably. And if today's youth are any indication, we'll be ready to confront each one.
