[Research] Phisherman's Friend – Getting control over a phishing backend

Dear Readers, once in a while I enjoy blogging about things unrelated to bug bounties. And so, as it happens, on a quiet Thursday night as I was about to go to bed, I received the following e-mail:

It roughly translates to: Your attention is needed, there has been an unwanted login from a location near Berlin.

Hmmm, an unwanted login from a location near Berlin? My younger brother lives in Berlin, so I wondered if he had logged in to my PayPal account. I doubted it, so I decided to visit the link in the email. Clicking on it brought me to the following site:

Bildschirmfoto 2016-06-10 um 11.11.56

Interesting, http://paypal.de-conflict.ru/ <- as you probably noticed, this is definitely not something we should trust; it's a phishing site. So, as you may or may not know, I like to use the awesome tool DirBuster. After firing it up and targeting this address, I quickly found some juicy stuff:

  • info.php <- PHP Info
  • /classes/ <- misconfigured folder
  • /backend/ <- Login Form
  • /backend/install <- ;-)….
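For readers curious what a tool like DirBuster does under the hood, here's a minimal sketch of the idea in Python: join each entry of a wordlist onto the target and check which paths respond. The four-entry wordlist is just the list above; real wordlists contain thousands of entries, and the probing function here is a simplified stand-in for DirBuster's actual logic.

```python
# Minimal sketch of a DirBuster-style directory brute-forcer.
from urllib.parse import urljoin
from urllib.request import urlopen
from urllib.error import HTTPError, URLError

def candidate_urls(base, wordlist):
    """Yield a full URL for every path in the wordlist."""
    for path in wordlist:
        yield urljoin(base, path)

def probe(url, timeout=5):
    """Return the HTTP status code for url, or None if unreachable."""
    try:
        return urlopen(url, timeout=timeout).status
    except HTTPError as e:
        return e.code  # 401/403 still reveal that the path exists
    except URLError:
        return None

if __name__ == "__main__":
    base = "http://paypal.de-conflict.ru/"  # the phishing host from this post
    for url in candidate_urls(base, ["info.php", "classes/", "backend/", "backend/install"]):
        print(url)  # probe(url) would then check each one
```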

I hope this makes you smile too. First, I looked at info.php, and the system description revealed: Linux fox.hidden-server.ru 2.6.32-673.8.1.lve1.4.3.el6.x86_64 #1 SMP Wed Feb 10 08:57:30 EST 2016 x86_64.

So, based on the .ru domain, it seems to be a Russian service used for criminal activities.

The directory /backend/ was just a simple login form asking for username and password -> I didn't feel like wasting my time, so I figured this was a dead end.

Then I found the funniest part, the directory /backend/install:

Translates to: Installation successful. Username, Password


Okay, good to know: apparently super-criminals use the username admin and the password 123456. Would it work on /backend/? Needing to know, I went back to /backend/ and tried the newly discovered credentials 🙂 Sure enough, I was logged in!
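The credential replay can be sketched like this. Note that the form field names ("username", "password") are assumptions on my part; the real backend's login form may name them differently, and no request is actually sent here.

```python
# Hedged sketch of replaying default credentials against /backend/.
from urllib.parse import urlencode
from urllib.request import Request

def build_login_request(base_url, user, password):
    """Build a POST request carrying the credentials as a form body.
    Field names are assumed; check the real form before sending."""
    body = urlencode({"username": user, "password": password}).encode()
    return Request(base_url + "backend/", data=body, method="POST")

req = build_login_request("http://paypal.de-conflict.ru/", "admin", "123456")
print(req.full_url, req.data)  # urlopen(req) would actually submit it
```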

Bildschirmfoto 2016-06-10 um 00.59.09

Tadaaaa, the beautiful back end of a PayPal phishing service. As you can see in the charts at the bottom, there had been three visitors by the time I accessed the dashboard. Browsing through it, I found a link to "data sets", which included the phished PayPal credentials, credit card numbers, etc.


Bildschirmfoto 2016-06-10 um 01.05.35


At the time, there were three entries: one from me, one from a victim, and the first one ever submitted, which could have been a test :-). On that note, if you're developing a site, what's the first thing you typically do after installing a new service? You test it. It turns out the owner was careless enough to enter some credentials on the website to test it, but didn't realize, or care, that his IP address was saved too. I've censored the IP address because I can't be 100% sure it was his.

After two hours of monitoring the website, there were a couple of real data sets of German phishing victims.


Phished Data including Username, Password, Birth Date, Credit Card Number, Address and everything else needed for online criminals to potentially ruin someone’s life.

Hmm, it seemed as if more and more people were falling for this scam, so I decided to take some action against it… how, you ask?

Well 🙂 I included the phrase "Your IP is 85.25.*.*" everywhere I could and went to bed to see what would happen next.

Data sets on the left and the note I left in the middle


Waking up on Friday, the first thing I did was go online to see how many more data sets there were… and it turned out the site was gone 🙂

Bildschirmfoto 2016-06-10 um 10.05.29

That's it 🙂 The site is gone, and the Russian criminal is now (hopefully) scared that someone recorded his actions and kept evidence of his online identity.

Further Work: 

I know several of my e-mail addresses are likely in a database of phishing targets, as I receive similar emails almost daily. These scam campaigns are mostly based on the same commercially sold phishing CMS, so my plan is to collect as many phishing sites as I can find and test whether they are similarly designed.


This post is intended for educational purposes and not meant to promote, incentivize or encourage any action which may or may not be considered illegal. None of the described actions are in any relation with my past, current or future employers.


Q = What happened to the harvested credentials?

A = I contacted the victims via mail (four at that time), and each of them followed the steps I suggested (change the password, contact PayPal, contact the credit institute, lock the credit cards). Three of the four had already felt that something strange was going on with that site but had decided not to do anything; they were glad I took action and contacted them.



[BugBounty] Sleeping stored Google XSS Awakens a $5000 Bounty

Dear Readers,

Today I want to share a short write-up about a stored cross-site scripting (XSS) issue I found on the Google Cloud Console. I consider it a lucky find. Some of you may remember the tweet I sent to  Frans Rosén  after he discovered a vulnerability on Google Payments:

Screenshot 2016-05-16 at 21:41:38

As it turned out, among the unsuccessful XSS payloads I had saved on my Google account, there was one that actually fired, though unexpectedly. When I was originally testing my payloads, I never managed to trigger execution; it happened only recently, and inadvertently. But let's start from the beginning.

As you may know, Google offers a 60-day free trial with a budget of $300, and all that is required is entering (correct) payment information to prove you are not a robot. So, I did that a few weeks ago and started entering XSS payloads in every field I could find … and nothing fired. I failed. A situation I assume we've all been in, right? Well, after two months or so, I received an invoice from Google and a notification that my free trial was ending. Wanting to avoid the charges, I quickly logged into my account and deleted my project, which was titled "><img src=x onerror=javascript:alert(1); … and boom, my payload got executed!

Screenshot 2016-05-04 at 13:41:51

As it turned out, Google was not filtering the error message shown once a project was canceled. Astute readers may question why this was not classified as a low-severity self-XSS. The issue was escalated because the Google Cloud Platform can be used by multiple users: if one user creates a project with a malicious XSS payload, that payload could be used against the project administrator to execute malicious JavaScript (if they delete the project, which seems likely).

For those unfamiliar, and the knowledge-hungry, here's how the payload gets reflected in the content of the site: the first quote and angle bracket, ">, close the preceding HTML tag, which allowed my injected <img> tag to be rendered in the page source. For this PoC, I simply used the img src=x payload. Since x is not a valid URL, the request fails immediately with a 404 HTTP response, which then invokes the onerror event to execute a JavaScript function. However, thanks to @Jobert from HackerOne for noting that this could also have returned a 2xx, or a 3xx redirecting to a 2xx, in which case my payload would not have fired. In my testing, I just used the alert function to create a popup, but malicious users could have used a cookie stealer or the Browser Exploitation Framework Project (BeEF) to escalate this issue (I did not have to tell Google that, though).
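To see why output escaping would have stopped this, here's a small Python illustration. The payload string is a cleaned-up approximation of the one I used, and the surrounding HTML template is hypothetical, not Google's actual markup.

```python
# Demonstrates how escaping defuses the project-title payload.
from html import escape

payload = '"><img src=x onerror=alert(1);>'  # approximation of my payload
template = '<span title="{}">Project deleted</span>'  # hypothetical markup

unsafe = template.format(payload)                     # breaks out of the attribute
safe = template.format(escape(payload, quote=True))   # rendered as inert text

print(unsafe)  # contains a live <img onerror=...> tag
print(safe)    # quotes and brackets become &quot; &gt; &lt;
```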

Screenshot 2016-05-04 at 14:06:56

Here’s the video POC I sent in for the Google VRP:

That’s it 🙂


Screenshot 2016-05-16 at 21:51:54



Thanks to Peter @yaworsk for editing :-)! Follow him and support him by buying his book! For more technical write-ups, have a look at ERNW's Insinuator blog, where I blog now and then about mobile security and IPv6.

If you have any questions please feel free to contact me at patrik.fehrenbach (at) it-securityguard.com


Protected: Digging into the Shopify POS Firmware (Part 1)

This content is password protected. To view it, please enter your password below:


[Research] – Stop OSX Spotlight from sending your location

Hey dear readers, it's been a while,

tl;dr: How to stop Apple from sending your location to their servers with every search

Bildschirmfoto 2015-06-15 um 00.51.23

I personally like to use the Spotlight function of OS X; it provides a fast way to access my files, but it also sends my geolocation to Apple every time I do a search. This blog post is about how to disable, or at least prevent, the built-in search function "Spotlight" from sending your IP-based location to the Apple servers.

But first have a look at what’s being sent:
GET /search?q=asd&latlng=48.082000,8.640000&geosrc=wifi,155.643824&storefront=143443-4,13&locale=de-DE&time_zone=Europe/Berlin&calendar=gregorian&key=montana4289 HTTP/1.1
Host: api.smoot.apple.com

This request was captured during a Burp session.

What you can see here in the parameters:

  • q is the query you send to the Apple servers
  • latlng is the latitude and longitude of your current (IP-based) location
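You can pull the captured request apart with a few lines of Python to see exactly what leaves the machine; the query string below is copied from the Burp capture above.

```python
# Parse the captured Spotlight query string into its parameters.
from urllib.parse import parse_qs

query = ("q=asd&latlng=48.082000,8.640000&geosrc=wifi,155.643824"
         "&storefront=143443-4,13&locale=de-DE&time_zone=Europe/Berlin"
         "&calendar=gregorian&key=montana4289")

params = parse_qs(query)
print(params["q"])       # the search term
print(params["latlng"])  # your coordinates
```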

It's needless to say that this information should stay private (at least in my opinion). If you want to keep your searches fancy, you can stop reading here; blocking the geolocation will change your search experience, but you will gain some privacy back 🙂

How To: 

First Way – Old School /etc/hosts

The /etc/hosts file is a local text file that tells the system how to resolve a domain name to an IP address. The trick here is to point the Apple domain api.smoot.apple.com to the localhost address of your machine: this tells the system to resolve every request to api.smoot.apple.com to localhost, thus leading nowhere. To do so:

1. Open a terminal
2. sudo vim /etc/hosts
3. Add the following line: 127.0.0.1 api.smoot.apple.com
4. Flush the DNS cache: sudo discoveryutil mdnsflushcache
5. All set 🙂
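The manual steps above can also be sketched programmatically. This is a hedged example, not a polished tool: point it at a copy of your hosts file first, since writing to the real /etc/hosts needs root.

```python
# Sketch of steps 2-3: append a blackhole entry for api.smoot.apple.com
# to a hosts file, skipping the write if an entry is already present.

def blackhole(hosts_path, host="api.smoot.apple.com"):
    """Append a 127.0.0.1 entry for host unless one already exists."""
    with open(hosts_path, "a+") as f:
        f.seek(0)
        if host in f.read():
            return False  # entry already there, nothing to do
        f.write(f"127.0.0.1\t{host}\n")
        return True

# blackhole("/etc/hosts")  # requires sudo; flush the DNS cache afterwards
```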

Bildschirmfoto 2015-06-15 um 00.58.39

Second Way – Little Snitch

I don't want to promote anything here, but the Little Snitch software is worth buying. Little Snitch helps you manage every incoming and outgoing connection; you can simply add a rule for the Spotlight search:

You want to block the locationd service that tries to connect to gs-loc.apple.com – forever.  Bildschirmfoto 2015-06-15 um 01.17.29

You are done 🙂 Privacy saved.

If you enjoyed this, leave feedback as a comment here or drop me an email at patrik.fehrenbach(at)it-securityguard.com. If you guys are interested, I might do a complete write-up on OS X hardening.