Tag Archives: ddos news

Education sector is fastest growing for DDoS mitigation

The education sector is the fastest growing segment in taking up distributed denial of service (DDoS) mitigation, according to DDoS protection services firm DOSarrest. The firm’s CTO, Jag Bains, told Computing that many companies, not just e-commerce firms, are deploying DDoS protection. “If their website goes down as a result of an attack, they can lose their SEO ranking or it could have an effect on their brand; there is a lot at stake aside from revenues,” he said.

While no single industry yet treats DDoS protection as a must, DOSarrest’s general manager, Mark Teolis, claimed that the education sector is one area which has grown significantly. “Our fastest growing segment in the last six months is the education sector, believe it or not,” he said. Teolis explained that the firm was getting business from “schools from the UK, the US and international universities” but said he couldn’t identify a specific reason why the sector has shown a sudden interest.

Bains believes it may be a result of educational institutes guarding themselves against their own students. “Students have easy access to DDoS tools, so they may want to try it against their own [school or university]. They could be motivated because they’re failing in something, and there are enough smart kids around to access tools; it is easy to Google them anyway,” he said.

But Teolis noted that such tools have been available on the internet for a long time, and questioned why there was a sudden surge of interest from educational institutes. Bains suggested it could be because school and university websites have become an integral part of the education system. “We’ve been talking about e-commerce and gaming [as being key industries for DDoS protection], but web presence itself is very important and schools and universities need to make their websites accessible. They need a website to give out grades, information and schedules; five years ago they weren’t really using the web page apart from explaining where the school is located,” he said.

While the education sector may be taking a keen interest, Teolis claims that no one segment is “taking up 30 per cent of the market”. He said that “10 or 15 per cent of the market is as good as it gets”.

As for industries that have not taken DDoS as seriously as others, Teolis believes many e-commerce firms haven’t contemplated being the victim of a DDoS attack. “There are still the odd e-commerce guys out there [who haven’t taken it as seriously]. Money is rolling in and they’re just focused on that; DDoS for them is somebody else’s problem. A lot of it is ‘my ISP will deal with it’; the fact of the matter is, it is difficult to stop all of the attacks,” he said.

Source: http://www.computing.co.uk/ctg/news/2325009/education-sector-is-fastest-growing-for-ddos-mitigation-dosarrest

See the original article here:
Education sector is fastest growing for DDoS mitigation

Don’t be a DDoS dummy: Patch your NTP servers, plead infosec bods

Popular attack method could be stopped with a config tweak

Security researchers have responded to recent denial of service attacks against gaming websites and service providers that rely on insecure Network Time Protocol servers by drawing up a list of vulnerable systems…

Continued here:
Don’t be a DDoS dummy: Patch your NTP servers, plead infosec bods

E-toll site weathers denial of service (DDoS) attack

Sanral’s e-toll Web site suffered a denial of service (DoS) attack on Friday, according to the agency. “Some users complained of slow site performance, and our service provider traced the problem to a denial of service attack of international origin,” said Sanral spokesman Vusi Mona. No further details of the attack were available, but Alex van Niekerk, project manager for the Gauteng Freeway Improvement Project, said the site has come under repeated attack since going live, but suffered only minor performance degradation.

DoS attacks, particularly distributed denial of service (DDoS) attacks, are a popular technique used to knock sites offline, overwhelming them with traffic until they are unable to service their clients. Activist group Anonymous frequently uses DDoS to attack targets, using its wide base of supporters to generate traffic. Botnets often launch DDoS attacks from their installed base of zombie PCs. And last year, anti-spam service Spamhaus suffered one of the largest DDoS attacks in history, with incoming traffic peaking at 300Gbps, launched by a Dutch Web host known for harbouring spammers.

Sanral’s Web site has been the target of several attacks lately, including a hack which may have leaked personal information, a flaw which allowed motorists to be tracked in real-time, and a session fixation attack which allowed login sessions to be hijacked.

Source: http://www.itweb.co.za/index.php?option=com_content&view=article&id=70192:e-toll-site-weathers-denial-of-service-attack

See more here:
E-toll site weathers denial of service (DDoS) attack

SPAM supposedly spotted leaving the fridge

Internet of Things security scares already need to take a chill pill

It’s still silly season, it seems. Tell the world that a bunch of small business broadband routers have been compromised and recruited into botnets, and the world yawns…

Continue reading here:
SPAM supposedly spotted leaving the fridge

US-CERT warns of NTP Amplification attacks

US-CERT has issued an advisory warning enterprises about distributed denial of service attacks that flood networks with massive amounts of UDP traffic using publicly available network time protocol (NTP) servers. Known as NTP amplification attacks, they exploit the monlist feature in NTP servers, also known as MON_GETLIST, which returns the IP addresses of the last 600 machines that interacted with an NTP server. NTP itself is a classic set-and-forget service, generally used to sync clocks between servers and computers, but the protocol is vulnerable to hackers making forged REQ_MON_GETLIST requests that enable traffic amplification. “This response is much bigger than the request sent making it ideal for an amplification attack,” said John Graham-Cumming of Cloudflare.

According to US-CERT, the MON_GETLIST command allows admins to query NTP servers for traffic counts. Attackers send this command to vulnerable NTP servers with the source address spoofed as the victim’s. “Due to the spoofed source address, when the NTP server sends the response it is sent instead to the victim. Because the size of the response is typically considerably larger than the request, the attacker is able to amplify the volume of traffic directed at the victim,” the US-CERT advisory says. “Additionally, because the responses are legitimate data coming from valid servers, it is especially difficult to block these types of attacks.”

To mitigate these attacks, US-CERT advises disabling monlist or upgrading to NTP version 4.2.7, which disables monlist by default. NTP amplification attacks have been blamed for recent DDoS attacks against popular online games such as League of Legends, Battle.net and others. Ars Technica today reported that the gaming servers were hit with up to 100Gbps of UDP traffic. Similar traffic volumes were used to take down American banks and financial institutions last year in allegedly politically motivated attacks.
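For ntpd deployments that cannot immediately upgrade, the mitigation the advisory describes amounts to a small configuration change. A sketch of the relevant ntp.conf lines follows (a minimal example, assuming a stock ntpd install; adapt the restrict lines to your own network policy):

```conf
# /etc/ntp.conf: disable the monitoring facility that answers
# monlist queries (relevant on ntpd versions prior to 4.2.7)
disable monitor

# Belt and braces: refuse mode 6/7 status queries from arbitrary
# clients while still serving ordinary time requests
restrict default kod nomodify notrap nopeer noquery
restrict -6 default kod nomodify notrap nopeer noquery
```

After restarting ntpd, a monlist probe against the server should go unanswered, removing it from the pool of usable amplifiers.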
“Unfortunately, the simple UDP-based NTP protocol is prone to amplification attacks because it will reply to a packet with a spoofed source IP address and because at least one of its built-in commands will send a long reply to a short request,” Graham-Cumming said. “That makes it ideal as a DDoS tool.” He added that an attacker can retrieve a list of open NTP servers that support monlist using readily available Metasploit or Nmap modules.

Graham-Cumming demonstrated the kind of amplification possible in such an attack. He sent the MON_GETLIST command to an NTP server in a request packet 234 bytes long; the response was split across 10 packets and totalled 4,460 bytes. “That’s an amplification factor of 19x and because the response is sent in many packets an attack using this would consume a large amount of bandwidth and have a high packet rate,” Graham-Cumming said. “This particular NTP server only had 55 addresses to tell me about. Each response packet contains 6 addresses (with one short packet at the end), so a busy server that responded with the maximum 600 addresses would send 100 packets for a total of over 48k in response to just 234 bytes. That’s an amplification factor of 206x!”

Source: http://threatpost.com/us-cert-warns-of-ntp-amplification-attacks/103573
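The arithmetic behind those figures can be checked with a short sketch. The numbers come from the quotes above; the per-address overhead is approximated from the 55-address sample, so the scaled estimate lands slightly above the quoted 206x:

```python
# Back-of-envelope check of the amplification figures quoted above:
# a 234-byte monlist request drew a 4,460-byte reply (10 packets,
# 55 addresses); a full reply carries up to 600 addresses.
request_bytes = 234
observed_reply_bytes = 4460
observed_addresses = 55
max_addresses = 600
addresses_per_packet = 6

observed_factor = observed_reply_bytes / request_bytes
print(f"observed amplification: ~{observed_factor:.0f}x")  # ~19x

# Scale the observed per-address cost up to a full 600-address reply
packets_for_full_reply = max_addresses // addresses_per_packet  # 100 packets
full_reply_bytes = observed_reply_bytes / observed_addresses * max_addresses
full_factor = full_reply_bytes / request_bytes
print(f"full monlist reply: {packets_for_full_reply} packets, "
      f"~{full_reply_bytes / 1000:.0f} kB, ~{full_factor:.0f}x amplification")
```

The estimate comes out around 208x, consistent with the “over 48k in response to just 234 bytes” figure in the article.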

View the original here:
US-CERT warns of NTP Amplification attacks

Dropbox hit by DDoS attack, but user data safe; The 1775 Sec claims responsibility

The Dropbox website went offline last night, with a hacking collective calling itself The 1775 Sec claiming responsibility for the attack on the cloud storage company’s website. The 1775 Sec took to Twitter just moments before Dropbox went down on Friday night, claiming that they were responsible. “BREAKING NEWS: We have just compromised the @Dropbox Website http://www.dropbox.com #hacked #compromised” tweeted The 1775 Sec.

This tweet was followed by another in which the group claimed it was giving Dropbox time to fix its vulnerabilities, and that if the company failed to do so it should expect a database leak. The group claimed that the hack was in honour of Aaron Swartz. Dropbox’s status page at the time acknowledged that there was downtime and that the service was ‘experiencing issues’.

The hackers then revealed that their claims of a database leak were a hoax. “Laughing our asses off: We DDoS attacked #DropBox. The site was down how exactly were we suppose to get the Database? Lulz” tweeted The 1775 Sec. The group claimed that it only launched a DDoS attack, did not breach Dropbox security, and did not have access to Dropbox user data.

Dropbox, for its part, said that its website was down because of issues during “routine maintenance” rather than a malicious attack. In a statement Dropbox said: “We have identified the cause, which was the result of an issue that arose during routine internal maintenance, and are working to fix this as soon as possible… We apologize for any inconvenience.”

Just over an hour later, Dropbox said that its site was back up. “Dropbox site is back up! Claims of leaked user info are a hoax. The outage was caused during internal maintenance. Thanks for your patience!” read the tweet from Dropbox.

Source: http://www.techienews.co.uk/974664/dropbox-hits-ddos-user-data-safe-1775-sec-claims-responsibility/

Read More:
Dropbox hit by DDoS attack, but user data safe; The 1775 Sec claims responsibility

Could Cross-site scripting (XSS) be the chink in your website’s armour?

Sean Power, security operations manager for DOSarrest Internet Security, gives his advice on how businesses that rely heavily on their web presence can avoid (inadvertently) making their users susceptible to malicious attackers.

Cross-site scripting, commonly known as XSS, is a popular attack vector and gets its fair share of the limelight in the press, but why is it such a problem and how is it caused? Essentially, XSS is a code vulnerability in a website that allows an attacker to inject malicious client-side scripts into a web page viewed by a visitor. When you visit a site that has been compromised by an XSS attack, you inadvertently execute the attacker’s program in addition to viewing the website. This code could be downloading malware, copying your personal information, or using your computer to perpetuate further attacks.

Of course, most people don’t look at the scripting details on a website, but with popular wikis and web 2.0 content that is constantly updated and changed, it’s important to understand the ramifications from a security standpoint. Modern websites need a high degree of input from the user in order to be interactive, and that input can be a place for attackers to inject content that downloads malware to a visitor or enslaves their computer; such ‘open’ areas of a website are hard to monitor and must be continually updated and reviewed. XSS code can appear on the web page, in banner ads, even as part of the URL; and if it’s a site that is visited regularly, users will as good as submit themselves to the attacker. In addition, as XSS code runs on the client side, it has access to anything that JavaScript has access to in the browser, such as cookies that store information about browsing history.
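As a hypothetical illustration of how XSS can appear “as part of the URL”, consider a handler that reflects a query parameter straight into its response without encoding (the page, parameter and `steal()` call are invented for this sketch):

```python
# Sketch of a reflected XSS flaw: the server echoes the ?q=
# query parameter into the HTML response without any encoding.
from urllib.parse import parse_qs, urlparse

def vulnerable_search_page(url: str) -> str:
    """Naive handler that reflects the search term back unescaped."""
    query = parse_qs(urlparse(url).query).get("q", [""])[0]
    return f"<h1>Results for: {query}</h1>"

# An attacker-crafted link injects script into the victim's page:
evil = "https://example.com/search?q=<script>steal(document.cookie)</script>"
print(vulnerable_search_page(evil))
# <h1>Results for: <script>steal(document.cookie)</script></h1>
```

Anyone who follows the crafted link runs the injected script in their own browser, in the context of the trusted site.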
One of the real concerns about XSS is that by downloading script to a client-side computer, that endpoint can become enslaved into a botnet (a group of computers infected with malware so that a third party can control them) and used to participate in denial of service attacks. Users might not even be aware that they are part of an attack. In a recent case, we identified how a popular denial of service engine called ‘JSLOIC’ was used as script in a popular website, making any visitor an unwitting participant in a denial of service attack against a third party for as long as the browser window remained open.

The range of what can be accomplished is huge: malware can be inserted into a legitimate website, turning it into a watering hole that can infect a visitor’s computer, and this can impact anyone. Once the XSS is put into a website, the user becomes a victim and the attacker has all of the information that the browser has.

In terms of preventing it: firstly, the hole in the website that has been exploited has to be closed. The main tactic to prevent XSS code running on your website is to make sure you are ‘locking all the doors’ and reviewing your website code regularly to remove bugs and any vulnerabilities. Done properly, this should be a continual process. If a website carries malware because the owner does not review it regularly, attackers will be able to alter the malicious code to dominate the page and infect more visitors. You can limit the chances of getting malicious code on your website by routinely auditing it for unintended JavaScript inclusions. But with XSS, especially non-persistent XSS, the best defence is to validate all data coming in and make sure it is sanitised, or checked for malicious code. This is especially true for parts of your website that get regular updates, like comment sections.
It is not enough to assume that because the site was clean before, new updates will also be clean. Even if you follow secure coding practice and go through code reviews, websites sometimes sit for six months with no changes made; that is why vulnerability testing is important as new bugs come up. Remember, HTTP and HTML are full of potential vulnerabilities: these protocols were designed long before anyone imagined what the web would become. So if you write website code without considering SQL injection or XSS, you will write a website full of holes.

Top three tips:
– Review your website and sanitise your code regularly to ensure there is no malicious code or holes where code can be inserted.
– Consider not allowing comments to host external links, or approve those links before they are published, to prevent code from being inserted easily.
– Monitor traffic in and out of your website for signs of unusual behaviour.

Source: http://www.information-age.com/technology/security/123457575/could-xss-be-the-chink-in-your-website-s-armour-
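On the sanitisation point, the minimal server-side defence is to encode untrusted input for the HTML context before echoing it back. A sketch using Python’s standard library (the `render_comment` helper is illustrative, not a real framework API):

```python
# Output encoding neutralises reflected XSS: html.escape converts
# characters that could break out of an HTML context into entities,
# so injected markup renders as inert text instead of executing.
import html

def render_comment(user_input: str) -> str:
    """Wrap untrusted input in a paragraph, escaping HTML metacharacters."""
    return "<p>" + html.escape(user_input, quote=True) + "</p>"

payload = '<script>alert("xss")</script>'
print(render_comment(payload))
# <p>&lt;script&gt;alert(&quot;xss&quot;)&lt;/script&gt;</p>
```

Escaping on output is context-specific: this handles HTML body text, while attribute values, URLs and inline JavaScript each need their own encoding rules.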

See original article:
Could Cross-site scripting (XSS) be the chink in your website’s armour?