Thursday, January 31, 2013

Backtrack Forensics: safecopy

Menu: Forensics -> Forensic Carving Tools
Directory: /usr/local/bin/safecopy
Official Website: http://safecopy.sourceforge.net/
License: GNU GPL v2

safecopy is a data recovery tool that tries to extract as much data as possible from a seekable but problematic source (i.e., one with damaged sectors), like floppy disks, hard disk partitions, CDs, etc., where other tools like dd would fail due to I/O errors. It can do multiple runs on a bad disk: the first run extracts the easily accessible, error-free data and makes note of the bad sectors. Subsequent runs retry the bad sectors multiple times, at finer resolution. This is what the default example shows as well:

Safecopy 1.6 by CorvusCorax
Usage: safecopy [options] source target
Options:
--stage1 : Preset to rescue most of the data fast,
using no retries and avoiding bad areas.
Presets: -f 10% -r 10% -R 1 -Z 0 -L 2 -M BaDbLoCk
-o stage1.badblocks
--stage2 : Preset to rescue more data, using no retries
but searching for exact ends of bad areas.
Presets: -f 128* -r 1* -R 1 -Z 0 -L 2
-I stage1.badblocks
-o stage2.badblocks
--stage3 : Preset to rescue everything that can be rescued
using maximum retries, head realignment tricks
and low level access.
Presets: -f 1* -r 1* -R 4 -Z 1 -L 2
-I stage2.badblocks
-o stage3.badblocks
All stage presets can be overridden by individual options.

There are many more options: we can set the number of retries (-R), create a bad block file as output (-o), and use it later as input (-I). We can also mark the bad areas with a special string (-M) instead of zeros, so we can find those locations more easily later.

As I don't have any damaged media, I just did a simple run:
safecopy /dev/sdc sdc1.img
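Putting the three stage presets together, a minimal wrapper might look like the sketch below; the device and image names are placeholders, and each stage reuses the badblock list the previous one wrote, per the presets shown above.

```shell
# Sketch of a full three-stage rescue; the source device and output
# image are placeholders for your own setup.
run_stages() {
    src=$1
    img=$2
    safecopy --stage1 "$src" "$img" &&
    safecopy --stage2 "$src" "$img" &&
    safecopy --stage3 "$src" "$img"
}
# Usage: run_stages /dev/sdc rescued.img
```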


Backtrack Forensics: EXT3/4 file recovery with extundelete

Menu: Forensics -> Forensic Carving Tools
Directory: /usr/local/bin/extundelete
Official Website: http://extundelete.sourceforge.net/
License: GNU GPL v2

extundelete is a file recovery tool for the EXT3 and EXT4 journaling file systems. It recovers files by reading the journal and the inode tables. The drive we are recovering files from must be unmounted for the tool to work. The restored files are placed in the "RECOVERED_FILES" directory.

Usage:

It's quite simple. I created a new virtual hard disk for the test, formatted it, and placed 3 files on it. I also created an MD5 sum for each of them.
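The checksum part of that setup uses only standard tools; a sketch (the paths and file names here are examples):

```shell
# Record MD5 sums for the test files, so the recovered copies can be
# verified bit-for-bit later with `md5sum -c`.
mkdir -p /tmp/extundelete-test
cd /tmp/extundelete-test
echo "first file"  > a.txt
echo "second file" > b.txt
md5sum a.txt b.txt > sums.md5
# ... delete the files, recover them with extundelete ...
md5sum -c sums.md5   # reports "a.txt: OK" / "b.txt: OK" when the hashes match
```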


Then I unmounted it, and restored one file with:

extundelete --restore-file 'a.txt' /dev/sdc1
and rechecked the MD5 hash, which matched.


We can restore all files on a drive with:
extundelete --restore-all /dev/sdc1

We can also restore based on the inode number:
extundelete --restore-inode inode_number /dev/sdc1
The restored files will have the inode number as their extension.

Wednesday, January 30, 2013

Backtrack Forensics: NTFS file recovery with scrounge-ntfs

Menu: Forensics -> Forensic Carving Tools
Directory: /usr/local/sbin/scrounge-ntfs
Official Website: http://thewalter.net/stef/software/scrounge/
License: Open Source BSD type license

This is an NTFS file recovery tool. It reads through all blocks on the disk and tries to recover all the files on the file system. It needs some information in order to work:
  • Start Sector: This is where the partition starts on the disk.
  • End Sector: This is where the partition ends on the disk.
  • Cluster Size: This is the size of one 'block' of data on a partition (in sectors; by default it's 8).
  • MFT Offset: Offset to NTFS Master File Table (in sectors).
Usage:

The tool has an NTFS partition search option, but it is not implemented yet. It can, however, try to detect/guess the required values listed above by running:
scrounge-ntfs -l /dev/sda
If the values can't be determined, you can either examine the disk with a hex editor or guess. Here is a guide from the author: http://thewalter.net/stef/software/scrounge/guessing.html


Once we have the info, we can start the recovery into an output directory by running:
scrounge-ntfs -m 2097152 -c 8 -o /root/Desktop/out/ /dev/sda 63 12583809
and as we can see, the tool starts to rebuild the files in the correct hierarchy.
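For orientation, here is how a cluster-based position relates to the sector values the tool takes; the MFT cluster number below is purely an assumed example, not something read from a real disk:

```shell
# Converting an MFT position given in clusters to the sector offset
# that -m expects; -c gives the sectors-per-cluster value.
CLUSTER_SIZE=8          # sectors per cluster, the -c value
MFT_CLUSTER=262144      # assumed MFT start, in clusters
MFT_OFFSET=$((MFT_CLUSTER * CLUSTER_SIZE))
echo "$MFT_OFFSET"      # 2097152, the value passed to -m in the run above
```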



Backtrack Forensics: reglookup

Menu: Forensics -> Forensic Analysis Tools
Directory: N/A
Official Website: http://projects.sentinelchicken.org/reglookup/
License: GNU GPL

This will be a very short write-up, as the tool is not actually installed on Backtrack, and I won't deal with it in this case. We get the following error:
sh: reglookup: command not found

I tried to remove it (apt-get remove reglookup) and reinstall it (apt-get install reglookup), but with no success. I looked into the downloaded Debian package, and apparently it's missing the executable... I hope this will be fixed in the Kali release, and I will revisit this blog entry then.

Backtrack Forensics: exiftool

Menu: Forensics -> Forensic Analysis Tools
Directory: /pentest/misc/exiftool/
Official Website: http://www.sno.phy.queensu.ca/~phil/exiftool/
License: GNU GPL v1 or above

exiftool is a Perl script that can extract, and for some file types even edit, EXIF metadata. The list of information that can be extracted is awfully long; if you have ever worked with any tool that opens JPEG files, you have probably seen most of it. Today the most sensitive piece of it is probably the location data embedded in pictures. The list of all supported tags, their formats, etc. can be found on the official website; it is so long that it could easily fill a book by itself. EXIF info is not limited to images, though; here is the list of file formats the tool supports (r - supports read, w - supports write):

         File Types
------------+-------------+-------------+-------------+------------
3FR r | EIP r | LNK r | PAC r | RWZ r
3G2 r | EPS r/w | M2TS r | PAGES r | RM r
3GP r | ERF r/w | M4A/V r | PBM r/w | SO r
ACR r | EXE r | MEF r/w | PDF r/w | SR2 r/w
AFM r | EXIF r/w/c | MIE r/w/c | PEF r/w | SRF r
AI r/w | EXR r | MIFF r | PFA r | SRW r/w
AIFF r | F4A/V r | MKA r | PFB r | SVG r
APE r | FFF r/w | MKS r | PFM r | SWF r
ARW r/w | FLA r | MKV r | PGF r | THM r/w
ASF r | FLAC r | MNG r/w | PGM r/w | TIFF r/w
AVI r | FLV r | MOS r/w | PICT r | TTC r
BMP r | FPX r | MOV r | PMP r | TTF r
BTF r | GIF r/w | MP3 r | PNG r/w | VRD r/w/c
CHM r | GZ r | MP4 r | PPM r/w | VSD r
COS r | HDP r/w | MPC r | PPT r | WAV r
CR2 r/w | HDR r | MPG r | PPTX r | WDP r/w
CRW r/w | HTML r | MPO r/w | PS r/w | WEBP r
CS1 r/w | ICC r/w/c | MQV r | PSB r/w | WEBM r
DCM r | IDML r | MRW r/w | PSD r/w | WMA r
DCP r/w | IIQ r/w | MXF r | PSP r | WMV r
DCR r | IND r/w | NEF r/w | QTIF r | WV r
DFONT r | INX r | NRW r/w | RA r | X3F r/w
DIVX r | ITC r | NUMBERS r | RAF r/w | XCF r
DJVU r | J2C r | ODP r | RAM r | XLS r
DLL r | JNG r/w | ODS r | RAR r | XLSX r
DNG r/w | JP2 r/w | ODT r | RAW r/w | XMP r/w/c
DOC r | JPEG r/w | OFR r | RIFF r | ZIP r
DOCX r | K25 r | OGG r | RSRC r |
DV r | KDC r | OGV r | RTF r |
DVB r | KEY r | ORF r/w | RW2 r/w |
DYLIB r | LA r | OTF r | RWL r/w |

Not bad, eh? :) Let's see its very basic usage:

./exiftool - prints the man page
./exiftool -ver - prints the current version


The most simple run is just specifying the filename; it will print all the EXIF information:
./exiftool /root/sample.jpg


We can update any EXIF tag (if metadata writing is supported for the file type) with the -TAG= option. I updated the ISO value for testing:
./exiftool -ISO=300 /root/sample.jpg
You can see in the new output that the value has been updated. When we do an update, the original file is backed up with an "_original" suffix appended to the filename; in this case it's "sample.jpg_original".


We can also delete a tag's information by not assigning any value. The special all tag will delete all tags.
./exiftool -all= /root/sample.jpg
As you can see, now all meta information has been deleted.


There are tons of other options, like exporting/importing tags from CSV, print unsupported tags, print results to a file, recursively process a whole directory, faster processing of JPEG images, and much, much more...
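As an example of the batch options mentioned above, here is a small wrapper sketch that recursively exports a few tags from a whole directory tree to CSV (-csv and -r are exiftool's CSV and recursion options; the tag list and directory are just illustrative assumptions):

```shell
# export_tags: recursively dump the selected EXIF tags from every
# supported file under a directory into a CSV file.
export_tags() {
    dir=$1
    out=$2
    ./exiftool -csv -r -ISO -GPSLatitude -GPSLongitude "$dir" > "$out"
}
# Usage: export_tags /root/pictures tags.csv
```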

Tuesday, January 29, 2013

Backtrack Forensics: iPhone Analyzer

Menu: Forensics -> Digital Forensics
Directory: /pentest/forensics/iphoneanalyzer
Official Website: http://sourceforge.net/projects/iphoneanalyzer/
License: GNU GPL v3

iPhone analyzer is a tool which can gather lots of data from iPhone backup files. Features:
  • iPhone Backup Browsing
  • Native file viewing (plist, sqlite, etc)
  • Searching including regular expressions
  • ssh access for jailbroken phones (beta)
  • Reports
  • Restore files
  • Recover backups
  • View all iPhone photos
  • examine address book, sms and loads of others
  • find and recover passwords
  • Export files to local filesystem
  • Online and offline mapping
  • Geo track where a device has been
  • IOS5 and earlier versions supported
  • IOS6 is only partially supported (several known problems) - at the time of this writing
Usage:

We can find the iPhone backup files in the following directory:

Windows (prior to Vista):
%user home%\Application Data\Apple Computer\MobileSync\Backup
Windows (Vista and later):
%user home%\AppData\Roaming\Apple Computer\MobileSync\Backup
MAC OS X:
%user home%/Library/Application Support/MobileSync/Backup/

Under this folder there will be a directory named with a 40-character hex string, which contains the files. I simply copied the backups to a thumb drive and plugged it into Backtrack. We can open the backups by selecting:
File -> Open: New Backup Directory


Once the backup is opened, we will see the Info.plist details in the middle. The top shows the most relevant information, and at the bottom we can see the full raw content.


On the right side we will see the Manifest.plist contents.


On the left side we have two things: access to the file system, where we can navigate, and bookmarks, which take us to some useful files like the address book, messages, etc.


Once we open a file, it will create a new tab where we can view it. We can close the tab by clicking on "x". We have different views for a file, shown in the tabs at the bottom associated with it; these change based on the type of the file we opened.



Bookmarks:


There is a very useful part called "Concepts", accessible from the bookmarks. It gathers time, location, and name information from the iPhone and displays it on a map. We can select which information we want to see (calls, address book entries, image metadata) and it will update the other cells accordingly.


A good guide about the tool: http://www.crypticbit.com/files/ipa_user_guide.pdf

Backtrack Forensics: Memory analysis with volatility

Menu: Forensics -> RAM Forensic Tools
Directory: /pentest/forensics/volatility
Official Website: http://code.google.com/p/volatility/
License: GNU GPL v2

volatility is probably the best open source memory analysis tool. It is written in Python, so it runs on any platform, and it can be extended with plugins, which are Python scripts as well, so you can easily create your own. It supports all major Windows and Linux versions (full list on the official site) and most major memory dump formats.

Usage:

I will use a WinXP SP2 image for the examples, which I got for testing. It has a rootkit installed for study purposes.

./vol.py -h - this is the help, and will list all the plugins currently available
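Since every plugin takes the same -f argument, a tiny shell helper saves some typing; this is purely a convenience sketch, using the image path from the examples below:

```shell
# vol: run any volatility plugin against the same memory image without
# retyping the -f argument each time.
vol() {
    ./vol.py "$@" -f /root/mem/winxp-mem.mdd
}
# Usage: vol sockets
#        vol psxview
```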

To display the list of open sockets found in the memory, run:
./vol.py sockets -f /root/mem/winxp-mem.mdd


To display the list of processes, run the following (it checks the memory dump with many different methods, and tells us which methods found each process and which didn't):
./vol.py psxview -f /root/mem/winxp-mem.mdd


To get a list of recently run CLI commands, run (we can see in the results that the rootkit was installed):
./vol.py cmdscan -f /root/mem/winxp-mem.mdd


To get a list of the registry hives found, run:
./vol.py hivelist -f /root/mem/winxp-mem.mdd


Based on that we can do a hashdump; for this we need the locations of the system and SAM hives. The command will be:
./vol.py hashdump -y 0xe1018378 -s 0xe1496b60 -f /root/mem/winxp-mem.mdd 
where -y specifies the location of the system hive, and -s the location of the SAM hive.

That's all; it has quite a few plugins, so you can play with it and discover more.

Monday, January 28, 2013

Backtrack Forensics: volafox

Menu: Forensics -> RAM Forensic Tools
Directory: /pentest/forensics/volafox

Volafox is a Mac OS X memory analysis tool based on volatility. Unfortunately I couldn't get hold of a Mac OS X memory image, so I couldn't really test it. Two images (memory and kernel) should be available at the links below, provided by the author, but the links are not working:

http://forensic.korea.ac.kr/volafox/files/SnowLeopard/MemoryImage.zip
http://forensic.korea.ac.kr/volafox/files/SnowLeopard/mach_kernel.zip

Usage:

In order to get it running, we need to remove the first line from the code:
#!c:\python\python.exe
and also give executable permissions:
chmod +x volafox.py
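Both preparation steps in one go, demonstrated here on a stand-in file (the real volafox.py contents depend on the release you downloaded):

```shell
# Create a stand-in script carrying the offending Windows shebang,
# then strip its first line and make it executable -- the same two
# fixes applied to the real volafox.py.
printf '#!c:\\python\\python.exe\nprint("volafox")\n' > volafox.py
sed -i '1d' volafox.py      # drop the Windows shebang line
chmod +x volafox.py
head -n 1 volafox.py        # now starts with the Python code itself
```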

Some commands:

volafox.py -i MemoryImage.mem -s mach_kernel -o machine_info - display Mac OS X version info
volafox.py -i MemoryImage.mem -s mach_kernel -o mount_info - display mounted device info
volafox.py -i MemoryImage.mem -s mach_kernel -o proc_info - process list information
volafox.py -i MemoryImage.mem -s mach_kernel -o proc_info -x [PID] - more info on the process with the given PID

Here is the help:


Official website: http://code.google.com/p/volafox/
Author's blog: http://forensic.n0fate.com/

Sunday, January 27, 2013

Backtrack Forensics: bulk_extractor

Menu: Forensics -> Forensic Analysis Tools
Directory: /usr/local/bin/bulk_extractor

bulk_extractor is a tool that searches a disk image for regular expressions. It has quite a few pre-defined ones, and we can also create our own. We can specify one via the command line, or multiple, which it reads from a file. The tool runs in two phases: first it collects all information from the disk, then it creates histograms. It supports raw images (.dd), EnCase (.E01) and AFFLIB (.aff) files, or it can be run directly on a disk, and it runs on multiple threads. bulk_extractor will also create a wordlist of all the words found in the disk image, which can be used as a dictionary for cracking encryption.

Let's see how to use it:

bulk_extractor -o output_dir image - this will scan the image file, and put all the results in the output directory
bulk_extractor -o output_dir image -j 30 - set threads to 30
bulk_extractor -o output_dir image -j 30 -E pdf - turns off all scanners except pdf
bulk_extractor -o output_dir image -e wordlist - enables wordlist scanner
bulk_extractor -o output_dir image -f 'regex-goes-here' - enables regex scanning, results are written to find.txt
bulk_extractor -h - help

There are quite a few other options, around tuning, enabling / disabling scanners and scanning a directory structure.
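For the multiple-expressions case, the patterns go one per line into a file which is then passed with -F; the file name and patterns below are made-up examples:

```shell
# Write a couple of example regular expressions, one per line; matches
# are reported the same way as with a single -f expression.
cat > patterns.txt <<'EOF'
[0-9]{3}-[0-9]{2}-[0-9]{4}
secret[a-zA-Z]*
EOF
# bulk_extractor -o output_dir -F patterns.txt image
```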

Starting the scan (it will run for a few hours):


Results directory: we can see that there is a file for each scan type, containing all the matches of the corresponding expressions.


Example of the domain file; the number on the left side is the offset:


Official Website: https://github.com/simsong/bulk_extractor

Saturday, January 26, 2013

Backtrack Forensics: rootkit scanning with rkhunter

Menu: Forensics -> Anti-Virus Forensic Tools
Directory: /bin/

rkhunter is a tool similar to chkrootkit: it also scans the system for rootkits, but it is capable of a bit more. Let's see what we can do with it. It performs checks like:

  • MD5 hash comparison
  • Looking for default files used by rootkits
  • Wrong file permissions for binaries
  • Looking for suspected strings in LKM and KLD modules
  • Looking for hidden files
  • Optional scanning within plaintext and binary files


First we can check the version, and also check whether there is a newer one:

rkhunter -V - display current version
rkhunter --versioncheck - check if there is an update


Then update the current database:

rkhunter --update


Starting the scan is very simple:

rkhunter -c

It will run for about 20-30 minutes, and a couple of times we need to press Enter to move forward.
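If you want the scan to run unattended (e.g., from cron), rkhunter has options to skip the keypresses and to limit output to warnings; a possible cron fragment is sketched below (the --sk and --rwo options are assumptions about your rkhunter build -- check rkhunter --help):

```shell
# /etc/cron.d/rkhunter -- hypothetical nightly scan at 03:00;
# the full details still end up in /var/log/rkhunter.log
0 3 * * * root /usr/local/bin/rkhunter --check --sk --rwo
```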


The scan logs (what it printed on the screen, and much more) will be at /var/log/rkhunter.log.

There is one more useful feature: we can have rkhunter store SHA1 hashes of some common system files; on later scans it compares the current hashes with the stored ones, and raises a warning if anything changed. This is done by running:

rkhunter --propupd


The hashes are stored in /var/lib/rkhunter/db/rkhunter.dat


Official website: http://www.rootkit.nl/projects/rootkit_hunter.html