Posts for: #Malaysia

Contact Tracing Apps: they’re OK.

I thought I'd write down my thoughts on contact tracing apps, especially since a recent BFM survey suggested 53% of Malaysians wouldn't download a contact tracing app due to privacy concerns. It's important for us to address this: I firmly believe contact tracing is an important weapon in our arsenal against COVID-19, and having 53% of Malaysians dismiss it outright is concerning.

But first, let's understand what Privacy is.

Privacy is Contextual

Privacy isn't secrecy. Secrecy is not telling anyone, but privacy is about having control over who you tell and in what context.

For example, if you met someone for the first time at a friend's birthday party, it would be completely rude and unacceptable to ask questions like:

  • What's your weight?
  • What's your last drawn salary?
  • What's your age?

In that context you're unlikely to find someone who will answer these questions truthfully.

But...

Age and weight are perfectly acceptable questions for a doctor to ask you at a medical appointment, and your last drawn salary is something any company looking to hire you will ask. We've come to accept these questions as OK -- under these contexts.

You might still not want to answer them, which might mean you don't get the job, or the best healthcare -- but you can hardly be offended by them. Far more people will answer these same questions truthfully if you change the context from random stranger at a party to doctor's appointment.

So privacy is contextual. To justify concerns, we have to evaluate both the context and the question before coming to a conclusion.

So let's look at both, starting with the context:

[Read more]

The problem with Grab

As a company, Grab has done enormously well for itself, and naturally will be the target of some hate.

But I think there's a deeper issue with Grab that needs addressing before it becomes an unsolvable problem.

Grab is a win-win

Let's start with what makes Grab so appealing.

Grab (at least in my mind) is the highest-paying hourly wage job in the country. As long as you possess a car and a valid driving license, you can be a Grab driver, earning significantly more than in any other hourly wage job.

According to this WOB article (which looks suspiciously like a paid ad), the average Grab driver earns RM5,000 per month, which is crazy money for an unskilled job -- and yes, driving for Grab is unskilled labour.

For unskilled work in Malaysia, earning RM5,000 per month is a godsend; after all, even graduate employees don't earn that much. And like all hourly wage jobs, the more hours you put in, the more money you make -- RM5,000 is just where it starts.

So this seems like a win-win for everyone, drivers get to earn, and at the same time provide a service that is in high demand.

And in truth, Grab is a win-win -- at least for now.

Fast-forward

The problem appears when you fast-forward 10 years, or just two elections, from now.

Most Grab drivers I've met aren't doing this part-time. They're driving as a full-time job, and they're putting in serious hours (10-12 a day) to make serious money. That means they've no time to up-skill themselves, because every hour spent learning a new skill is an hour they could have been driving.

The cost of learning is a double-whammy for them: first they spend on acquiring the new skill (like everybody else), but they also lose the income from hours not spent driving. For most, this will be too high a price to pay.

You might argue that driving isn't un-skilled. But all it takes to be a Grab driver is a driving license and a car; skills don't factor into it. Grab doesn't care if you're a PhD, diploma holder or SPM drop-out -- it'll pay the same.

Grab views all of its drivers as suppliers of the one commodity it needs -- cars to move passengers. The only time Grab pays drivers more is when they turn on the auto-accept feature, because that makes its algorithm more efficient. The more subservient you are to the algorithm, the better it will reward you -- and that is a pretty nasty feeling.

So as more folks join the Grab bandwagon, we're sucking skilled labour out of the job market, leaving the entire country, as a whole, worse off in terms of competitiveness. But we're just getting started.

[Read more]

The Malaysian Government isn’t watching your porn habits

Recently, there was a poorly written article in the New Straits Times suggesting the Malaysian Police would know if you were watching porn online.

Let me cut to the chase, the article is shit.

The software in question, aptly named Internet Crimes Against Children Child Online Protective Services (ICACCOPS), is used to detect Child Pornography, and Child Pornography only -- as the name clearly implies. It is a collaborative effort by law enforcement agencies, and was shared with PDRM, probably as a gesture of goodwill.

[Read more]

Security Headers for Gov-TLS-Audit

Gov-TLS-Audit got a brand new domain today. No longer is it sharing a crummy domain with sayakenahack (which is still blocked in Malaysia!); it now has a place to call its own.

The domain cost me a whopping $18.00/yr on AWS, and involved a couple hours of registration and migration.

So I felt that while migrating domains, I might as well implement proper security headers as well. Security Headers are HTTP Headers that instruct the browser to deny or allow certain things, the idea being the more information the site tells the browser about itself, the less susceptible it is to attack.

I was shocked to find out that Gov-TLS-Audit had no security headers at all! I assumed AWS (specifically CloudFront) would take care of ‘some’ HTTP headers for me – I was mistaken. CloudFront takes care of the TLS implementation, but does not implement any security headers for you, not even strict-transport-security, which is TLS-related.

So unsurprisingly, a newly created CloudFront distribution, using the reference AWS implementation, fails miserably when it comes to security headers.

I guess the reason is that HTTP headers are very site-dependent. Had CloudFront done it automatically, it might have broken a majority of sites. And implementing headers is one thing; fixing the underlying problems is another, much bigger, one.

But what security headers to implement?
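As context for that question, here's a minimal Python sketch of the kind of check an audit like this performs -- given a response's headers, report which commonly recommended security headers are missing. The header list below is illustrative, not exhaustive:

```python
# Commonly recommended security headers -- an illustrative list, not a complete one.
SECURITY_HEADERS = [
    "strict-transport-security",
    "content-security-policy",
    "x-frame-options",
    "x-content-type-options",
    "referrer-policy",
]

def missing_security_headers(response_headers):
    """Return the security headers absent from an HTTP response.

    `response_headers` is a dict of header name -> value, as returned
    by any HTTP client; names are compared case-insensitively.
    """
    present = {name.lower() for name in response_headers}
    return [h for h in SECURITY_HEADERS if h not in present]
```

For example, a freshly created CloudFront distribution that only sets HSTS would still report the other four headers as missing.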

[Read more]

The GREAT .my outage of 2018

[Image: .my DNSKEY failure report -- boy, that's a lot of RED!]

Last week, MyNic suffered a massive outage taking out any website that had a .my domain, including local banks like maybank2u.com.my and even government websites hosted on .gov.my.

Here’s a great report on what happened from IANIX. I’m no DNSSEC expert, but here’s my layman's reading of what happened:

  1. .my uses DNSSEC
  2. Up to 11-Jun, .my used a DNSKEY with key tag:25992
  3. For some reason, this key went missing on 15-Jun, and was replaced with DNSKEY key tag:63366, which is still a valid SEP for .my
  4. Unfortunately, the DS record at the root still pointed to key tag:25992
  5. So DNSSEC validation started failing
  6. 15 hours later, instead of correcting the error, someone tried to switch off DNSSEC by removing all the signatures (RRSIG)
  7. But this didn't work, as the parent zone still had a DS entry pointing to key tag:25992, and hence validators still expected DNSSEC to be on
  8. 5 hours after that, they added back the missing DNSKEY key tag:25992 (oh, we found it!), but with invalid signatures for all entries -- still failing
  9. Only 4 hours after that did they fix it, with the proper DS entry at the root for DNSKEY key tag:63366 and valid signatures
  10. That's a 24-hour outage on all .my domains.
So basically, something broke, they sat on it for 15 hours, then tried a fix, didn't work. Tried something else 5 hours after that, didn't work again! And finally after presumably a lot of praying to the Gods of the Internet and a couple animal sacrifices, managed to fix it after a 24-hour downtime.
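The core failure is a broken chain of trust: the DS record at the parent must reference a DNSKEY actually published in the child zone. A toy sketch of that check, using the key tags from the IANIX report:

```python
def ds_matches_dnskey(ds_key_tags, dnskey_tags):
    """DNSSEC validation requires at least one DS record at the parent
    to reference a DNSKEY actually published in the child zone.
    Both arguments are sets of 16-bit key tags."""
    return bool(set(ds_key_tags) & set(dnskey_tags))

# The .my situation after the key went missing: the root's DS still
# pointed at 25992, but the zone only published 63366.
broken = ds_matches_dnskey({25992}, {63366})   # chain broken, validation fails
fixed = ds_matches_dnskey({63366}, {63366})    # chain restored
```

This is a simplification (real validation also checks the signatures, which is why step 8 above still failed), but it captures why removing the RRSIGs alone couldn't fix things.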

I defend my fellow IT practitioners a lot on this blog, but this is a difficult one. Clearly this was the work of someone who didn’t know what they were doing and refused to ask for help, instead trying one failed fix after another, which made things worse. As my good friend Mark Twain would say – it’s like a Mouse trying to fix a pumpkin.

I don’t fully understand DNSSEC (it’s complicated), but then again, I’m not in charge of a TLD. It’s unacceptable that someone could screw up this badly – that the screw-up impacted so many people, and all we got was a lousy press release.

The point is, it shouldn’t take 24 hours to resolve a DNSSEC issue, especially when it’s such a critical piece of infrastructure. I’ve gone through reports of similar DNSSEC failures, and in most cases recovery takes 1-5 hours. The .nasa.gov zone had a similar issue that was resolved in an hour; very rarely do we see a 24-hour outage, so what gives?

I look forward to an official report from MyNIC to our spanking new communications ministry, and for that report to be shared with the public.

[Read more]

The Malaysian Ministry of Education Data Breach

Ok, I’ve been pretty involved in the latest data breach, so here’s my side of the story.

At around 11pm last Friday, I got a query from Zurairi at The Malay Mail, asking for a second opinion on a strange email the newsdesk received from an ‘anonymous source’. The email was a regular vulnerability disclosure, but one that was full of details, attached with an enormous amount of data.

This wasn’t a two-liner tweet, this was a detailed email with outlined sub-sections. It covered why they were sending the email, what the vulnerable system was, how to exploit the vulnerability and finally (and most importantly!) a link to a Google Drive folder containing Gigabytes of data.

The email pointed to a Ministry of Education site called SAPSNKRA, used by parents to check on their children’s exam results. Quick Google searches revealed the site had security issues in the past, including one blog advising parents to proceed past the invalid certificate warning in Firefox. But let’s get back to the breach.

My first reaction was to test the vulnerability, and sure enough, the site was vulnerable to SQL injection, in exactly the manner specified by the email. So far, the email looked legitimate.

Next, I verified the data in the Google Drive folder, by downloading the gigabytes of text files, and checking the IC Numbers of children I knew.

I further cross-checked a few parents' IC numbers against the electoral roll. Most children have some indicator of their father's name embedded in their own, either through a surname or the full name of the father after the bin, binti, a/l or a/p. By keying in the father's IC number and cross-referencing the father's name against what was in the breach, it was easy to see that the data was the real deal.
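That cross-check is simple enough to mechanise. A rough sketch, assuming the common Malaysian patronymic markers (real data would need far more careful normalisation than this):

```python
# Markers that typically precede the father's name in Malaysian names.
PATRONYMIC_MARKERS = (" bin ", " binti ", " a/l ", " a/p ")

def patronym_matches(child_name, father_name):
    """Crude check: does the part of the child's name after the
    patronymic marker appear in the father's full name?"""
    child = child_name.lower()
    for marker in PATRONYMIC_MARKERS:
        if marker in child:
            patronym = child.split(marker, 1)[1].strip()
            return patronym in father_name.lower()
    return False  # no marker found; can't tell from the name alone
```

A match doesn't prove the record is genuine on its own, but across many records it's strong evidence the breach data lines up with the electoral roll.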

So I called back Zurairi and confirmed to him that the data was real, and that the site should be taken offline. I also contacted a buddy of mine over at MKN, to see if he could help, and Zurairi had independently raised a ticket with MyCert (a ticket??!!) and tried to contact the Education Minister via his aide.

Obviously neither Zurairi nor myself, nor any of the other journalists I kept in touch with, could report on the story. The site was still vulnerable, and we didn’t want someone else breaching it.

The next morning, I emailed the anonymous source and asked them to take down the Google Drive, explaining that the breach was confirmed, and people were working to take down the site. Hence there was no reason to continue exposing all of that personal information on the internet.

They agreed and wiped the drive clean, and shortly after I got confirmation that the SAPSNKRA website had been taken down. So with the site down, and the Google Drive wiped clean, it seemed the worst was behind us.

Danger averted…at least for now.

But since data breaches last forever, and this was a breach, we should talk about what data was in the system. Zurairi did a good job here, but here’s my more detailed take on the issue.

[Read more]

3 times GovTLS helped fix government websites

A couple of months back I started GovTLSAudit, a simple service that would scan .gov.my domains and report on their implementation of TLS. But the service seems to have benefits above and beyond that, specifically around having a list of government sites that we can use to cross-check against other intel sources like Shodan (which we already do daily) and VirusTotal.
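A sketch of what that cross-check amounts to (a hypothetical helper, not the actual GovTLSAudit code): given hostnames surfaced by an external source, flag any .gov.my hosts that aren't on our known list.

```python
def unknown_gov_hosts(observed, known):
    """Return .gov.my hostnames from `observed` that we've never scanned.

    New hostnames under a government domain are worth a manual look --
    they may be forgotten infrastructure, or something worse.
    """
    known_set = set(known)
    return sorted(
        host for host in observed
        if host.endswith(".gov.my") and host not in known_set
    )
```

Run daily against sources like Shodan or VirusTotal, this is how oddball subdomains (like the phishing URLs below) surface.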

So here’s 3 times GovTLSAudit helped secure government websites.

That time Yayasan Islam Terengganu was used as a phishing website

I used VirusTotal's search engine to see if they had extra .gov.my domains to scan, and found a few rather suspicious-looking URLs, including:

  • paypal-security-wmid0f4-110ll-pp16.yit.gov.my
  • appleid.corn-security2016wmid7780f4-110ll-16.yit.gov.my
  • paypal-security-wmid7110f4-110ll-pp16.yit.gov.my

This was an obvious phishing campaign being run out of a .gov.my domain. Digging further, I found that the IP address the malicious URLs resolved to was local, and belonged to Exabytes. And while the root page was a bare Apache directory, buried deep within the site's sub-directories was a redirect that pointed to a Russian IP.

I took to twitter to report my findings – I kinda like twitter for this – and the very next day Exabytes came back with a follow-up saying they were fixing it. That’s good, because a phishing campaign running on .gov.my infrastructure isn’t exactly what you’d like.

There are a lot more details in the tweet thread about how I investigated this – click here to follow it. A warning though: I regularly delete my old tweets, so get it while it’s there :).

[Read more]

Look ma, Open Redirect on Astro

If you’ve come here from a link on twitter – you’d see that the address bar still says login.astro.com.my, but the site is rendering this page from my blog. If not, click this link to see what I mean. You’ll get something like this:

Somehow I’ve managed to serve content from my site on an Astro domain. Rest assured, I haven’t ‘hacked’ Astro's servers and uploaded my page; instead I’ve performed an equally sinister attack called an open redirect.
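An open redirect happens when a site blindly redirects to whatever URL a query parameter supplies. The usual fix is to validate the target before redirecting; here's a minimal sketch (the allow-list is hypothetical, not Astro's actual configuration):

```python
from urllib.parse import urlparse

# Hypothetical allow-list of hosts the login page may redirect to.
ALLOWED_HOSTS = {"astro.com.my", "login.astro.com.my"}

def safe_redirect_target(url, default="/"):
    """Allow relative paths or allow-listed hosts; otherwise fall back
    to a safe default landing page."""
    parsed = urlparse(url)
    if not parsed.netloc:                      # relative path, e.g. "/account"
        return url
    if parsed.scheme in ("http", "https") and parsed.netloc in ALLOWED_HOSTS:
        return url
    return default
```

Note the explicit scheme check: URLs like `//evil.example` parse with an empty scheme but a real host, which is a classic way to slip past naive validators.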

[Read more]

Here’s one thing that’s already changed post GE14

In 2015, I was invited to a variety program on Astro to talk about cybersecurity.

This was just after Malaysian Airlines (MAS) had their DNS hijacked, but I was specifically told by the producer that I could NOT talk about the MAS hack, because MAS was a government-linked company (GLC), and they couldn’t talk bad about GLCs.

Then, half-way through the interview, they asked me about government intervention, and I said something to the effect of “Governments are part of the problem and should refrain from censoring the internet”. That sound-bite never made it to TV – it was censored.

[Read more]

Gov TLS Audit : Architecture

Last month, I embarked on a new project called GovTLS Audit, a simple(ish) program that would scan 1,000+ government websites to check their TLS implementation. The code would go through a list of hostnames, and scan each host for TLS implementation details like redirection properties, certificate details and HTTP headers, even stitching together Shodan results into a single comprehensive data record. That record would then be inserted into DynamoDB, and exposed via a REST endpoint.

Initially I ran the scans manually on Sunday nights, then uploaded the output files to S3 buckets, and ran the scripts to insert them into the DB.

But 2 weeks ago, I decided to automate the process, and the architecture of this simple project is now complete(ish!). Nothing is ever complete, but this is a good checkpoint for me to begin documenting the architecture of GovTLS Audit (sometimes called siteaudit), and to share it.

What is GovTLS Audit

First let's talk about what GovTLS Audit is -- it's a Python script that scans a list of sites on the internet, and stores the results in 3 different files: a CSV file (for human consumption), a JSONL file (for insertion into DynamoDB) and a JSON file (for other programmatic access).

A different script then reads in the JSONL file, loads each row into the database (DynamoDB), and then uploads the 3 files as a single zip to an S3 bucket.
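That loading step looks roughly like this. It's a sketch, not the actual GovTLS Audit code: the table name and file name are placeholders, and the upload itself needs boto3 plus AWS credentials, so it's shown commented out.

```python
import json

def parse_jsonl(lines):
    """Parse JSONL scan output into a list of records, skipping blank lines."""
    return [json.loads(line) for line in lines if line.strip()]

# Hypothetical upload step (requires boto3 and AWS credentials):
# import boto3
# table = boto3.resource("dynamodb").Table("siteaudit")
# with table.batch_writer() as batch:
#     for record in parse_jsonl(open("scan.jsonl")):
#         batch.put_item(Item=record)
```

JSONL (one JSON object per line) works nicely here because each line maps directly to one DynamoDB item, and `batch_writer` handles the batching and retries.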

On the ‘server-side’ there are 3 Lambda functions, all connected to an API Gateway endpoint:

  • One that queries the latest details for a site [/siteDetails]
  • One that queries the historical summaries for a site [/siteHistory]
  • One that lists all scans (zip files) in the S3 bucket [/listScans]

Finally, there's a separate S3 bucket to serve the 'website', but that's just a simple HTML file with some JavaScript to list all the scan files available for download. In the end, it looks something like this (click to enlarge):

[Read more]