Tag Archives: australia

DoS scum attacked one-third of the ‘net between 2015 and 2017

Even CHARGEN services are hosed, daily, says CAIDA study. One-third of Internet hosts with IPv4 addresses were subject to denial of service attacks in the last two years.…

Bitter feud between partners as IBM deflects eCensus blame

NextGen and Vocus reject claims of error. A bitter feud has broken out between IBM and its internet service provider partners for the 2016 eCensus as the main contractor tried to deflect blame for the site’s meltdown on August 9.

In its first detailed response to the failure, IBM said it had plans in place for the risk of DDoS attacks, but its efforts were to no avail thanks to a failure at an upstream provider. The ABS said at the time that it had been forced to take the site offline on Census night following a series of DDoS attacks, combined with the failure of the network geoblocking function and the collapse of a router. The statistics body has publicly criticised IBM for failing to properly implement a geoblocking service, which would have halted the international DDoS attack targeted at the Census site.

But IBM is now laying blame squarely at the feet of its internet service provider partner NextGen and NextGen’s upstream supplier Vocus for the geoblocking bungle. It claimed NextGen had provided “repeated” assurances – including after the day’s third DDoS attack – that the geoblocking strategy IBM codenamed ‘Island Australia’ had been correctly put in place. However, when the fourth and biggest DDoS attack of the day hit at around 7:30pm, IBM said it became clear that a Singapore link operated by Vocus had not been closed off, allowing the attack traffic to pass through to the Census site.

“Vocus admitted the error in a teleconference with IBM, NextGen and Telstra around 11.00 pm on 9 August 2016,” IBM said. “Had NextGen (and through it Vocus) properly implemented Island Australia, it would have been effective to prevent this DDoS attack and the effects it had on the eCensus site. As a result, the eCensus site would not have become unavailable to the public during the peak period on 9 August 2016.”

IBM said that while it accepted its responsibility as the head contractor for the eCensus, it could not have avoided using ISPs to provide links for the website.
“It is not possible for an IT services company such as IBM to implement the 2016 eCensus without engaging ISPs. It was necessary for IBM to involve the ISPs in the implementation of the geoblocking solution as they have control over their respective data networks and are in a position to block internet traffic originating from particular domains or IP addresses.”

IBM did, however, admit what many security experts had speculated occurred: that following the fourth DDoS attack, a system monitoring dashboard showed an apparent spike in outbound traffic, causing its staff to wrongly assume data was being exfiltrated from the website and prompting IBM to shut the site down. The contractor also revealed that a configuration error meant a manual reboot of one of its routers – needed after the eCensus firewall became overloaded with traffic – took much longer to complete than it should have, keeping the site offline for a further hour and a half.

NextGen, Vocus fight back

But Vocus said NextGen was well aware that Vocus would not provide geoblocking services, and had instead recommended its own DDoS protection product. IBM declined the offer, Vocus said. NextGen and Vocus instead agreed on remote triggered black hole (RTBH) route advertisements with international carriers.

“If Vocus DDoS protection product was left in place the eCensus website would have been appropriately shielded from DDoS attacks,” Vocus said in its submission to the inquiry.

Vocus rejected IBM’s claim that it had failed to implement geoblocking, revealing that it had not been made aware of IBM’s DDoS mitigation strategy – including ‘Island Australia’ – until after the fourth attack on August 9.

“As a result, any assumption that Vocus was required to, or had implemented Island Australia or geo-blocking including, without limitation … are inaccurate,” Vocus said.
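The ‘Island Australia’ strategy IBM describes amounts to filtering traffic by source address range. A minimal sketch in Python, using two deliberately tiny, hypothetical prefixes standing in for Australian address space – a real deployment would carry the full set of AU-delegated ranges and enforce the filter at the ISPs’ border routers, not in application code:

```python
import ipaddress

# Hypothetical sample prefixes standing in for Australian address space;
# a real "Island Australia" filter would use the full AU-delegated set.
AU_PREFIXES = [ipaddress.ip_network(p) for p in ("1.0.0.0/24", "101.160.0.0/11")]

def allow(src_ip: str) -> bool:
    """Permit traffic only if its source address falls inside an AU prefix."""
    addr = ipaddress.ip_address(src_ip)
    return any(addr in net for net in AU_PREFIXES)

print(allow("101.160.0.1"))  # True: source inside a listed AU range
print(allow("8.8.8.8"))      # False: international source, dropped
```

The effectiveness of such a filter depends entirely on every upstream link applying it – which is exactly the gap IBM says the Vocus Singapore link left open.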
“Once Vocus was made aware of the fourth DDoS attack, it implemented a static null route to block additional DDoS traffic at its international border routers within 15 minutes.”

Vocus also argued that the fourth DDoS attack was not as large as IBM claimed, comprising attack traffic that peaked at 563Mbps and lasted only 14 minutes – which it said was “not considered significant in the industry”.

“Such attacks would not usually bring down the Census website which should have had relevant preparations in place to enable it to cater for the expected traffic from users as well as high likelihood of DDoS attacks.”

NextGen, in its own submission, claimed it had “strongly recommended” that IBM take up a DDoS protection product like the one offered by Vocus, but the contractor declined. The ISP said it was not made aware of the details of IBM’s ‘Island Australia’ strategy until six days before the eCensus went live in late July. At that point it told IBM that an IP address range IBM had provided was part of a larger aggregate network and therefore would not respond to “specific international routing restrictions” if ‘Island Australia’ was implemented.

“Nextgen recommended using an alternative IP address range, which would give IBM better control, but this was rejected by IBM,” the ISP said.

IBM instead chose to ask NextGen’s upstream suppliers to apply IP address blocking filters and international remote black holes for 20 host routes.

“Nextgen believes that the individual host routes picked by IBM may not be exhaustive, and DDoS attacks could come from other routes in the IP address range (which they did in the third DDoS attack on Census day),” NextGen said. “There were a number of routes without geoblocking during the fourth DDoS attack, and which were not identified during testing, along with the [Vocus] Singapore link.”

NextGen said it again offered to implement DDoS protection, this time at its own cost, which IBM agreed to four days after the events of August 9.
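NextGen’s objection – that blackholing 20 individually chosen host routes leaves the rest of the aggregate exposed – can be illustrated with a short Python sketch. The 203.0.113.0/24 aggregate and the 20 host addresses here are purely hypothetical (documentation addresses), not the actual eCensus network:

```python
import ipaddress

# Hypothetical aggregate range and 20 hand-picked host routes, standing in
# for the real eCensus addressing, which was not disclosed.
AGGREGATE = ipaddress.ip_network("203.0.113.0/24")
HOST_ROUTES = {ipaddress.ip_address(f"203.0.113.{i}") for i in range(1, 21)}

def reachable_internationally(dst: str) -> bool:
    """With only per-host blackholes in place, any address inside the
    aggregate but outside the 20 listed hosts stays reachable from abroad."""
    addr = ipaddress.ip_address(dst)
    return addr in AGGREGATE and addr not in HOST_ROUTES

print(reachable_internationally("203.0.113.5"))    # False: blackholed host route
print(reachable_internationally("203.0.113.200"))  # True: gap in coverage
```

Blackholing the whole aggregate (or renumbering onto a dedicated range, as NextGen says it recommended) would have closed that gap by construction.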
Source: http://www.itnews.com.au/news/bitter-feud-between-partners-as-ibm-deflects-ecensus-blame-439752

IBM botched geo-block designed to save Australia’s census

Bureau of Stats says spooks signed off IBM’s plan, but Big Blue mucked something up. Australia’s Bureau of Statistics has heavily criticised IBM for the security it applied to the nation’s failed online census, which was taken offline after a distributed denial of service (DDoS) attack that battered a curiously flimsy defensive shield.…

Linode fends off multiple DDOS attacks

Nowhere near as bad as its ten-day Christmas cracker, but something seems to be afoot. Cloud hosting outfit Linode has again come under significant denial of service (DoS) attack.…

Pokémon Go Servers Suffer Downtime, Possibly Due to DDoS Attacks

With server issues, Pokémon Go players may have had trouble catching much this weekend, and it wasn’t merely due to the tremendously popular game crashing a lot on account of a massive new roll-out. A hacker group has claimed responsibility for the server outage, citing DDoS attacks.

A hacking group known as PoodleCorp has claimed responsibility for Pokémon Go servers crashing on Saturday, an attack which coincided with a roll-out of the game in 26 new countries. While its claim is yet to be verified, the group has notably targeted several YouTube profiles, including the most followed YouTuber of them all, PewDiePie.

The claim was made via a social media post [1] on PoodleCorp’s Twitter account: “PokemonGo #Offline #PoodleCorp”. The group also re-tweeted another post from the supposed leader of the group, who implied that another, bigger attack was coming. The poster wrote [2]: “Just was a lil test, we do something on a larger scale soon”.

Several users took to social media to complain about the outage at a time when the gaming phenomenon is catching on like wildfire around the world, sending Nintendo’s share price skyrocketing by 86% in a week.

“I’m really pissed off that Pokémon Go is down because a group of killjoys decided it would be fun to hack the servers and take them offline.” — Meg Bethany Read (@triforcemeg) July 16, 2016

“Pokemon GO got DDoS’d and DDOS became a trending topic lmao”

Earlier this week, a security researcher discovered a potentially major security flaw [4] in the application. The augmented reality game has captured the imagination of people around the world: players capture virtual Pokémon before collecting and using them to battle Pokémon captured by other players. Released on July 7, ten days ago, the application has already been downloaded over 10 million times on Apple and Android devices.
The latest roll-out made the game available in 34 countries, including Australia, the United States and almost all of Europe.

Source: http://need-bitcoin.com/pokemon-go-servers-suffer-downtime-possibly-due-to-ddos-attacks/
