His “sources” told him that the DDoS (distributed denial of service) attack experienced at 7.30pm came from inside Australia, not from the United States as claimed.
He says IBM and the ABS were offered DDoS prevention services by their upstream provider, NextGen Networks, but declined, saying they didn't need them. Their plan was simply to ask NextGen to geoblock all traffic originating outside Australia in the event of an attack.
When the attack came from inside Australia, they had no defence.
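To illustrate why that plan was brittle: a geoblock filters only on the source country of traffic, so anything originating domestically sails straight through. Here is a minimal sketch, using a hypothetical country lookup rather than anything NextGen actually runs:

```python
# Minimal sketch of a geoblock filter: allow only traffic whose source
# country is Australia. The country lookup is a hypothetical stub (a real
# provider would use routing/GeoIP data); the point is that a domestic
# attacker is indistinguishable from a legitimate Australian user.

# Hypothetical mapping of source IPs to country codes (stand-in for GeoIP).
# Both addresses are from documentation ranges.
GEOIP = {
    "203.0.113.7": "AU",   # attack source inside Australia
    "198.51.100.9": "US",  # overseas source
}

def geoblock_allows(src_ip: str, allowed_country: str = "AU") -> bool:
    """Return True if the geoblock would let this packet through."""
    return GEOIP.get(src_ip) == allowed_country

print(geoblock_allows("198.51.100.9"))  # False: overseas traffic is dropped
print(geoblock_allows("203.0.113.7"))   # True: a domestic DDoS gets through
```

The overseas source is blocked, but the domestic one passes, which is exactly the failure mode described above.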
The second part of the official story was that a router failed. Gray's version involves a stuff-up with the firewall(s), which is too technical for me to summarise.
The third element of the official story was that some unauthorised data activity was noticed inside the system. Since a DDoS attack is often used as cover for data exfiltration (stealing), they pulled the plug.
Gray says what happened is that the IBM alerts were actually “offshore-bound system information/logs”.
We’ve never been told officially what these alerts were, just that no data had been stolen, altered or destroyed.
The first and the third parts of this sad tale sound entirely plausible. About the second I wouldn't know, but if the equipment/system failure at that particular moment was innocent, it was an amazing coincidence.
We can only hope that the truth will out.
When governments outsource, they sometimes don't retain enough internal expertise to manage the project properly. In this case, when things went pear-shaped, the Australian Signals Directorate, part of the Department of Defence, was called in.
If they don’t have the requisite expertise we are in big trouble.
Earlier post: Census crash