Douglas Berdeaux has written an excellent book, excellent from quite a number of points of view, some of which I will address. Packt Publishing have done a great service in making this and other titles available at their web site. It is one of over 2000 instructional books and videos there, many of which have extensive source code and are good 'instructors'.
I read a lot on my tablet, but most of the ebooks I read are "linear text" (think: 'novels', 'news'). A book like this is heavily annotated with differentiated fonts, type, and layout, and how well your ebook reader renders that may vary. None of the ones I used were as satisfactory as the PDF. For all its failings, if you want a page that looks "just so" whatever it is read on, then PDF still wins out. For many, this won't matter, since the source code can be downloaded in a separate ZIP file.
Of course you may be like me and prefer to learn by entering the code by hand so as to develop the learned physical habit which you can then carry forward. You may also prefer to have a hard copy version of the book rather than use a 'split screen' mode.
This is not a book about learning to code in Perl, or learning the basics of TCP/IP. Berdeaux himself says in the introduction:
This book is written for people who are already familiar with
basic Perl programming and who have the desire to advance this
knowledge by applying it to information security and penetration
testing. With each chapter, I encourage you to branch off into
tangents, expanding upon the lessons and modifying the code to
pursue your own creative ideas.
I found this to be an excellent 'source book' for ideas and worked through many variations of the example code. This book is a beginning, not an end point.
That was Then, This is Now
Basing a criticism on a 'penetrate and patch' view of pen-testing is, of course, rather biased. So is basing it on the idea that these are tools for malicious hackers. That has long since ceased to be the case. Today, penetration testing is a technique approved by the financial community as part of PCI DSS certification.
In one sense it's not 'penetrate and patch' so much as a classical red team. The blue team codes; the red team debugs by breaking the code. That is a quite acceptable approach to software development. Manufacturers crash cars to prove their safety. Most materials are 'stress-tested' to ensure they won't break during normal and even exceptional use. Pen-testing to prove correctness, compliance and resilience is perfectly valid.
Who This Book is For
Douglas Berdeaux has chosen to take the reader into the dirty byte-level depths of cracking WPA2, packet sniffing and disassembly, ARP spoofing (the right way), and performing other advanced tasks, such as blind and time-based SQL injection. Parts of the book are Perl code that mimics the functionality of other information security programs, so one can see how it all fits together. Although Perl was originally about scanning text and building reports, this is something quite different, dramatically different, and it shows what Perl is really capable of.
It wasn't until several years prior to writing this book that
I truly began to understand the harmonious nature of Perl,
Linux, and information security. Perl is designed for string
manipulation, which excels in an operating system that treats
everything as a file. Rather than writing Perl scripts to parse
the output from other programs, I was now writing independent
code that mimicked the functionality of other information
security programs. At this stage, I had a newfound appreciation
for the power of Perl, which opened the door for endless
opportunities, including this book.
I myself adopted Perl in 1989 with version 3 and built a complete ISP management, tracking and reporting/billing system. I found that it had all the expressive power of C but handled many matters such as string manipulation and pattern matching much more gracefully. And then there was the CPAN repository! Perhaps I should fault Berdeaux for not emphasising CPAN more, but to be fair, CPAN deserves a book of its own and is a living, growing subject.
This is certainly a 'how-to' book and Berdeaux makes it quite explicit that the examples and exercises are for the real world. He suggests a test-bench with an 802.11 Wi-Fi router that is capable of WPA2 encryption, two workstations (which can be virtual if networked properly) that will act as an attacker and a victim, a smartphone device, an 802.11 Wi-Fi adapter that is supported by the Linux OS driver for packet injection, network shared storage, and a connection to the Internet.
All that being said, the book is an excellent example of how to design, write and document open source code. So much open source code is just presented, and is difficult to understand or support: the author has not documented his design decisions, nor what the various code sections are trying to do, nor why that way of doing it rather than another was chosen. In the literary world we often have early manuscripts that show revisions, author's notes and such like. All too often with code we only see the end result. Berdeaux unfolds all this and the result is very readable. This is the kind of book that could be used on a course on either Perl or pen-testing because it is practical and will engage the student's interest.
Chapter 1 takes you through the basics of Perl and ends up discussing CPAN, showing you how to download LWP::UserAgent, which plays a key role in the code examples that follow. Readers already familiar with Perl can page through this quickly.
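To give a flavour of the pattern that recurs throughout the book, here is a minimal LWP::UserAgent fetch. This is my own sketch, not the book's code; the URL is a placeholder, and the User-Agent string is set only to show that many sites vary their output by it.

```perl
#!/usr/bin/perl
# Minimal sketch of the LWP::UserAgent pattern used throughout the book.
# The target URL is a placeholder -- substitute a host you are authorised to probe.
use strict;
use warnings;
use LWP::UserAgent;

my $ua = LWP::UserAgent->new(
    agent   => 'Mozilla/5.0',   # many sites vary their output by User-Agent
    timeout => 10,
);

my $response = $ua->get('http://www.example.com/');
if ($response->is_success) {
    print $response->decoded_content;
}
else {
    warn 'Request failed: ', $response->status_line, "\n";
}
```

Everything else the book does over HTTP, from OSINT scraping to SQL injection probing, is a variation on this handful of lines.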
Chapter 2 deals with shell programming under Linux using Bash. Again the basics are covered, and those with shell experience can move on quickly. The only part of importance to the experienced is some setup of the environment for what follows.
Chapters 3 and 4 deal with the wired environment before going on to the wireless environment in Chapter 5.
Chapter 3 is basically about replicating the functionality of NMAP using Perl. While this seems trivial, it introduces many concepts and tools that will be used later. Using them in this context makes them more visible and understandable than simply using them blindly later with no explanation, and it also shows how Perl can be used for network functions.
Along the way we meet many of the other network tools available under Linux, such as arp. Of course it helps if you know the basics of how the TCP handshake works, and of the Ethernet, IP, and TCP layers.
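The core idea of such a scanner is simple enough to sketch in a few lines of core Perl. This is not the book's code, just an illustration of the principle: a completed three-way handshake means the port is open. The host and port list are placeholders; only scan machines you own or are authorised to test.

```perl
#!/usr/bin/perl
# A minimal TCP "connect" port scanner in the NMAP spirit, using only core Perl.
use strict;
use warnings;
use IO::Socket::INET;

sub scan_ports {
    my ($host, @ports) = @_;
    my @open;
    for my $port (@ports) {
        # A successful connect means the TCP three-way handshake completed,
        # so something is listening on that port.
        my $sock = IO::Socket::INET->new(
            PeerAddr => $host,
            PeerPort => $port,
            Proto    => 'tcp',
            Timeout  => 1,
        );
        if ($sock) {
            push @open, $port;
            close $sock;
        }
    }
    return @open;
}

my @open = scan_ports('127.0.0.1', 22, 80, 443);
print "Open: @open\n";
```

A real scanner, like the book's, adds SYN scanning, timing control and service detection, but the skeleton is the same.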
Chapter 4 addresses packet capture and filtering with Perl. Those who have a thorough knowledge of the TCP protocol suite and of tools such as Wireshark can move quickly through this, as much of the first part of this chapter is about how to take packets apart using Perl. We then move on to the application layer and so into how to set up a "Man in the Middle" (MitM) attack. Berdeaux emphasises the importance of information gathering and uses the example MAC/IP addresses determined earlier to illustrate a hijacking with ARP spoofing.
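The byte-level dissection at the heart of this chapter comes down to Perl's pack/unpack templates. Here is my own miniature example, not the book's: unpacking the Ethernet header of a raw frame. The frame is synthetic, with made-up MAC addresses, so no capture interface is needed.

```perl
#!/usr/bin/perl
# Sketch of byte-level packet dissection: unpack an Ethernet header with unpack().
use strict;
use warnings;

sub parse_ethernet {
    my ($frame) = @_;
    # 6 bytes destination MAC, 6 bytes source MAC, 2 bytes EtherType (big-endian).
    my ($dst, $src, $type) = unpack 'H12 H12 n', $frame;
    return {
        dst  => join(':', unpack '(A2)*', $dst),
        src  => join(':', unpack '(A2)*', $src),
        type => sprintf('0x%04x', $type),      # 0x0800 = IPv4, 0x0806 = ARP
    };
}

# A synthetic frame: broadcast destination, an invented source MAC, EtherType ARP.
my $frame = pack('H12 H12 n', 'ffffffffffff', '0014a5cafe01', 0x0806) . "\0" x 28;
my $eth   = parse_ethernet($frame);
printf "%s -> %s (%s)\n", $eth->{src}, $eth->{dst}, $eth->{type};
```

The IP and TCP layers peel off the same way: each is just another unpack template applied to the remaining bytes.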
In Chapter 5 we move on to Wi-Fi networking with 802.11 and how to disassemble 802.11 frames. A more detailed knowledge of how 802.11 is managed is the basic requirement here, though Berdeaux does cover what is needed for his examples. Once again a variety of Linux networking tools, this time the Wi-Fi tools, are used alongside or within the Perl code.
Having laid these foundations, Chapter 6 moves on to applying these skills in the first stage of a penetration test: the gathering of information, in this case open source intelligence (OSINT) such as email addresses and DNS information. This covers not only the obvious googling but also searching social media sites such as Google+, LinkedIn, Facebook and others. This section shows the power of Perl's regular expression mechanism in filtering the desired information out of what might be a 'fire-hose' of results. As humans we look at only the first few results of a Google query, probably not noticing that there are many thousands of hits. A Perl-based scanner can dive deeper.
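The filtering idea can be shown in a few lines. This sketch of mine (the sample text and addresses are invented) pulls email addresses out of a blob of search-result HTML with one deliberately simple pattern; it is a harvesting heuristic, not a full RFC 5322 validator.

```perl
#!/usr/bin/perl
# OSINT-style filtering: extract unique email addresses from a fire-hose of text.
use strict;
use warnings;

sub harvest_emails {
    my ($text) = @_;
    my %seen;
    # Simple address pattern -- good enough for harvesting, not for validation.
    while ($text =~ /([A-Za-z0-9._%+-]+\@[A-Za-z0-9.-]+\.[A-Za-z]{2,})/g) {
        $seen{ lc $1 } = 1;    # normalise case, deduplicate via the hash
    }
    return sort keys %seen;
}

my $page = <<'HTML';
<p>Contact alice@example.com or Bob@example.com.</p>
<p>Press enquiries: press@example.com</p>
HTML

print "$_\n" for harvest_emails($page);
```

Point the same loop at thousands of fetched result pages and you have the 'dive deeper' scanner described above.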
Chapter 7 goes into detail about that powerful hack, SQL injection, making the point that SQL injection is one of the longest-running vulnerabilities in IT, bettered only by the buffer overrun. It is a demonstration that some web technologies, including languages, are inherently fragile, and that a simple mistake can have dramatic consequences.
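The fragility is easy to demonstrate without even touching a database; the damage is done to the SQL string itself. This is my illustration, not the book's code: the classic ' OR '1'='1 payload turns a name lookup into a tautology.

```perl
#!/usr/bin/perl
# Why string-built SQL is fragile: user input rewrites the query's logic.
use strict;
use warnings;

sub naive_query {
    my ($user) = @_;
    # The vulnerable pattern: user input interpolated straight into SQL.
    return "SELECT * FROM users WHERE name = '$user'";
}

my $sql = naive_query("admin' OR '1'='1");
print "$sql\n";
# The WHERE clause now matches every row. The fix is placeholders, e.g. with DBI:
#   $dbh->selectall_arrayref('SELECT * FROM users WHERE name = ?', undef, $user);
```

Blind and time-based injection, which the book covers, are refinements of this same string-rewriting trick for cases where the output is not shown directly.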
Chapter 8 looks at other web-based vulnerabilities and how to exploit them, such as cross-site scripting (XSS), file inclusion and others. Berdeaux makes these all very clear and simple and shows how Perl really is an easy-to-use tool, a hacker's "Swiss Army Knife".
Chapter 9 deals with password cracking. While Perl isn't as fast as lower-level languages for brute-force cracking, Berdeaux makes the very valid point that there are better ways, making use of precomputed tables and of leaked information, the Internet equivalent of the yellow sticky under the mousepad. Once again Google comes into play. In one sense this book is as much about using Google as it is about Perl!
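The precomputation point is worth a small sketch of my own: hash a wordlist once, then crack leaked hashes by constant-time lookup instead of hashing per guess. The five-word list stands in for a real dictionary such as rockyou.txt.

```perl
#!/usr/bin/perl
# Precomputed-table cracking: hash the wordlist once, then crack by lookup.
use strict;
use warnings;
use Digest::MD5 qw(md5_hex);

my @wordlist = qw(password letmein monkey dragon sunshine);

# Precomputed table: hash -> plaintext. Built once, reused for every leaked hash.
my %table = map { md5_hex($_) => $_ } @wordlist;

sub crack {
    my ($hash) = @_;
    return $table{ lc $hash };    # O(1) lookup instead of hashing per guess
}

my $leaked = md5_hex('letmein');  # pretend this came from a dumped database
my $plain  = crack($leaked);
print defined $plain ? "Cracked: $plain\n" : "Not in table\n";
```

This is also why unsalted fast hashes like MD5 are so weak: one table serves every victim.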
It is in the section on WPA (Wi-Fi) cracking that Berdeaux makes Perl shine. He begins with a clear explanation of the protocol and then carefully explains the code and how it works. He makes it look very simple and straightforward, an excellent piece of writing about something that can be very confusing.
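At the heart of WPA2-PSK is one derivation: the pairwise master key is PBKDF2-HMAC-SHA1(passphrase, SSID, 4096 rounds, 32 bytes). This is my core-Perl sketch of that derivation, not the book's code; a real cracker wraps handshake capture and verification around it.

```perl
#!/usr/bin/perl
# WPA2-PSK key derivation: PMK = PBKDF2-HMAC-SHA1(passphrase, SSID, 4096, 32).
use strict;
use warnings;
use Digest::SHA qw(hmac_sha1);

sub wpa2_pmk {
    my ($passphrase, $ssid) = @_;
    my $dk = '';
    for my $block (1, 2) {    # 32 bytes of output needs two 20-byte SHA-1 blocks
        my $u = hmac_sha1($ssid . pack('N', $block), $passphrase);
        my $f = $u;
        for (2 .. 4096) {
            $u = hmac_sha1($u, $passphrase);
            $f ^= $u;         # xor the whole chain together (string bitwise xor)
        }
        $dk .= $f;
    }
    return substr $dk, 0, 32;
}

my $pmk = wpa2_pmk('password', 'IEEE');
print unpack('H*', $pmk), "\n";
```

Those 4096 rounds per candidate passphrase are the whole point of the design: the cost lands on the attacker, which is why dictionary quality matters more than raw speed.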
"Metadata", addressed in Chapter 10 has been in the news recently due to revelations about national security agencies collecting communication information. "Metadata" refers to the contextual information rather than the actual content, the who, where, what. The metadata of a photograph can reveal where and when it was taken, how it was edited. How this information can be exploited is going to depend on the context. One might imagine law enforcement using metadata to trace child pornographers.
Files other than photographs also contain metadata. Examples include many of the types of documents stored on web sites that can be found by searching with Google and specifying the filetype. One of the most common of these is PDF, and Berdeaux uses this as an example too.
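A crude illustration of why PDFs leak: the classic Info-dictionary entries (/Author, /Creator, /Producer, /Title) are often stored as plain literal strings, so a regex over the raw bytes can recover them. This is my simplistic sketch with an invented document fragment; real PDFs may compress or encode these entries, which the book's approach (or a CPAN module such as Image::ExifTool) handles far better.

```perl
#!/usr/bin/perl
# Scrape classic Info-dictionary metadata out of raw (uncompressed) PDF bytes.
use strict;
use warnings;

sub pdf_info {
    my ($bytes) = @_;
    my %meta;
    for my $key (qw(Author Creator Producer Title)) {
        # Match "/Key (literal string)" -- only works for unencoded literals.
        $meta{$key} = $1 if $bytes =~ m{/$key\s*\(([^)]*)\)};
    }
    return \%meta;
}

# A fragment mimicking an uncompressed PDF Info dictionary (invented values).
my $sample = '<< /Title (Quarterly Report) /Author (J. Smith) /Producer (LibreOffice 7.4) >>';
my $meta   = pdf_info($sample);
print "$_: $meta->{$_}\n" for grep { defined $meta->{$_} } sort keys %$meta;
```

An /Author field plus a /Producer version string already tells an attacker a user name and an exploitable software version.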
In a general sense, metadata is an interesting form of information 'leakage' simply because it is not visible: an "out of sight, out of mind" phenomenon, added to the fact that many people are simply ignorant of its existence.
Chapter 11 deals with Social Engineering, and as Berdeaux says, that's about psychology:
Human psychology and the human nature of will, sympathy, empathy,
and the most commonly exploited nature of curiosity are all weaknesses
that a successful social engineer can use to elicit private information.
Some institutions and most large-scale client targets for penetration
testing are slowly becoming aware of this type of attack and are employing
practices to mitigate these attacks. However, just as we can sharpen our
weapons to clear through defense, we can also sharpen our social engineering
skills by investing our efforts in the initial OSINT, profiling and gathering
information, which we have already done throughout the course of this book.
This kind of penetration method relies on bait messages. It is effective because it so often works when all else fails. Ultimately it relies on human frailty and trust. Berdeaux makes it clear that a great deal of our trust is in the integrity of the programs we use, and he uses the example of a rogue version of SSH written in Perl to make this point.
Chapter 12 deals with the most important part of any penetration test operation: the reporting of the results. A quality report ensures a satisfied client. As Berdeaux says:
The process of planning the reports begins the minute we begin testing and
ends the minute we stop.
He goes on to add:
Logging successful steps is not only crucial for the logistics of the target
client, but can also lead to further exploitation after close analysis.
Perl is admirably suited to generating and tabulating reports; that was part of its original design concept. Along the way, Perl has seen the development of many code modules and has been the core of the engine behind many web sites. That has led to facilities for things such as graphing, which are used in the examples here.
The final report might be presented as a PDF or as a web page (HTML). Perl can handle both, and both are illustrated. These techniques obviously have wider application.
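Tabulating findings into HTML really is a few lines of core Perl. This sketch is mine, with invented findings; in a real engagement the rows would come from the test log.

```perl
#!/usr/bin/perl
# Tabulate pen-test findings into a minimal HTML report (invented sample data).
use strict;
use warnings;

my @findings = (
    { host => '10.0.0.5', issue => 'SQL injection', severity => 'High'   },
    { host => '10.0.0.9', issue => 'Open telnet',   severity => 'Medium' },
);

sub html_report {
    my (@rows) = @_;
    my $body = join '', map {
        "<tr><td>$_->{host}</td><td>$_->{issue}</td><td>$_->{severity}</td></tr>\n"
    } @rows;
    return <<"HTML";
<html><head><title>Penetration Test Report</title></head><body>
<table border="1">
<tr><th>Host</th><th>Finding</th><th>Severity</th></tr>
$body</table>
</body></html>
HTML
}

print html_report(@findings);
```

The same data structure can just as easily feed a PDF generator or a graphing module, which is the book's point about keeping the log machine-readable from the start.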
Finally in Chapter 13 we learn how to write GUIs in Perl with the "Tk" extensions. Along the way we have to learn the Object Oriented syntax of Perl.
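The OO-Perl pattern that Tk programming leans on can be shown in miniature: a package, a constructor that blesses a hash reference, and methods invoked with '->'. This toy class is my own, not from the book, but Tk widgets are driven through exactly this calling convention.

```perl
#!/usr/bin/perl
# The OO-Perl pattern in miniature: package, bless, and '->' method calls.
use strict;
use warnings;

package Target;

sub new {
    my ($class, %args) = @_;
    my $self = { host => $args{host}, ports => [] };
    return bless $self, $class;    # associate the data with the class
}

sub add_port {
    my ($self, $port) = @_;
    push @{ $self->{ports} }, $port;
}

sub summary {
    my ($self) = @_;
    return "$self->{host}: " . join(',', @{ $self->{ports} });
}

package main;

my $t = Target->new(host => '10.0.0.5');
$t->add_port($_) for 22, 80;
print $t->summary, "\n";
```

Swap Target for Tk::MainWindow and add_port for a widget constructor and you have the shape of every Tk program in the chapter.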
If I had been structuring this book I would have introduced OO-Perl, and made use of the GUI capabilities, much earlier. At the very least, the techniques of OO-Perl and callbacks that this chapter introduces are more generally applicable.
GUIs can be wonderful things, or they can be limiting things. It is up to the GUI designer. The point of this chapter is that you can be the GUI designer and have an interface that meets your needs.
|Title||Penetration Testing with Perl|
|Publisher||Packt Publishing (http://www.packt.com)|
|Address||Livery Place, 35 Livery Street, Birmingham B3 2PB, UK|
|Price||$26.99 ebook, $44.99 print+ebook|
My digital camera uses exif to convey a vast amount of contextual information and imprint it on each photo: date, time, the camera, shutter, aperture, flash. I have GPS in the camera so it can record the location and elevation. The exif protocol also allows for vendor-specific information and is extensible and customizable.
Unless and until we have an 'exif' for IoT, it's going to be lame and useless.
What is plugged in to that socket? A fan, a PC, a refrigerator, a charger for your cell phone? What's the rating of the device? How is it used? What functions other than on/off can be controlled?
Lame lame lame lame.
At the very least, this will apply 'many eyes' to some of the SSL code, and so long as the SSH pruning isn't wholesale slash-and-burn, cutting it back may prove efficacious for two reasons.
Less code can be simpler code, with decreased likelihood of there being a bug due to complexity and interaction.
Getting rid of the special cases such as VMS and Windows also reduces the complexity.
POSIX I'm not sure about; in many ways POSIX has become a dinosaur. Quite a number of Linux authors have observed that if you stop being anal about POSIX you can get code that works, and a simple #ifdef can take care of portability. In the 90% case there isn't a lot of divergence between the flavours, and in the 99% case the #ifdef can take care of that.
Whether SSH fits into the 90% or the 99% I don't know. The APIs for 'random' and 'crypto' are grey areas where implementations differ, but also ones where POSIX seems to be at its most anal and 'lowest common denominator'. I suspect that this is one place where the #ifdef route will allow more effective implementations.
We shall see what emerges, but on the whole the BSD team have a reputation for good security practices so I'm hopeful about the quality.
I'd be interested to see their testing approach.
He makes the case that once you put a computer in something it stops being that something and becomes a computer.
Camera + computer => computer
The latest intelligence on Al-Qaeda, a high profile Child Protection
report and plans for policing the London 2012 Olympics; three very
different documents with two things in common: firstly, they all
contained highly confidential information and secondly, they were all
left on a train.
Or maybe "Strangers on a Train"
Our latest research reveals that two thirds of Europe’s office commuters
have no qualms about peering across to see what the person sitting next
to them is working on; and more than one in ten (14 per cent) has
spotted confidential or highly sensitive information.
Perhaps that's cynical and pessimistic and a headline grabber, but then that's what makes news.
What I'm afraid of is that things like this set a low threshold of expectation, that people will think they don't need to be better than the herd.
Based on the demonstrated persistence of their enemies, I have a lot of respect for what Israeli security achieves.
Back to Verb vs Noun.
His point about baggage claim is interesting. It strikes me that this is the kind of location serious terrorists, that is the ones who worked in Europe through the last century, might attack: not just dramatic, but it shows how ineffectual airport security really is. And what will the TSA do about such an attack? Inconvenience passengers further.
So what's the best file system to use for archiving and data storage rather than the normal usage?
Won't that depend on ...
a) the nature of the archive files
If this is simply to be a 'mirror' of the regular file system, a 1:1 file mapping, then there is no need for the specific optimizations there would be if, for example, each snapshot were to be a single large file, a TAR or CPIO image say. You then have to look at what you are archiving: small files, large files ... Archiving mail as mbox is going to be different from archiving it as maildir. For example, the latter is going to consume a LOT of inodes, which affects how one would format an ext2, ext3 or ext4 file system, but is not relevant on a ReiserFS or BtrFS.
b) the demand for retrieval from the archive
This is actually a can of worms. You might not think so at first, but I've seen businesses put out of service because their 'backup' indexing was inadequate when the time came to retrieve a specific archived file of a specific date, as opposed to restoring the whole backup. You need to be driven by your business methods here, and that in turn will determine your indexing and retrieval, which will determine your storage format.
It's business-driven, not technology-driven. Why else would you be archiving?
Now while (b) is pretty much an 'absolute', (a) can end up being flexible. You HAVE to have a clear way of retrieval, otherwise your archive is just a 'junk room' into which your file system overflows.
That (a) can be flexible also means that the optimization curve is not clearly peaked. Why else would you be asking this question? What's the worst situation if you choose ReiserFS rather than extN? The size of the file system? The number of inodes?
But if your indexing is broken or inadequate, you've got a business problem.
An article on LinkedIn entitled 'The Truth about Practices' started a discussion thread with some of my colleagues.
The most pertinent comment came from Alan Rocker:
I'm not sure whether to quote "Up the Organisation", ("If you must have a policy manual, reprint the Ten Commandments"), or "Catch-22" (about the nice "tidy bomb pattern" that unfortunately failed to hit the target), in support of the article. Industry-wide metrics can nevertheless be useful, though it's fatal to confuse a speedometer and a motor.
However, not everyone in the group agreed with our skepticism and the observations of the author of the article.
And Anton, aren't the controls you advocate so passionately best practices?
NOT. Make that *N*O*T*!*!*! Even allowing for the lowercase!
So I need to compile a list of ALL assets, information or otherwise,
That leads to tables and chairs and powerbars.
OK so you can't work without those, but that's not what I meant.
Physical assets are only relevant insofar as they are part of information processing. You should not start from those; you should start from the information and look at how the business processes make use of it. Don't confuse your DR/BC plan with your core ISMS statements. ISO standard 22301 addresses that.
This is, ultimately, about the business processes.
I often explain that Information Security focuses on Information Assets.
Some day, on the corporate balance sheet, there will be an entry
which reads, "Information"; for in most cases the information is
more valuable than the hardware which processes it.
-- Adm. Grace Murray Hopper, USN Ret.
Some people see this as a binary absolute: they think that there's no need to assess the risks to the physical assets, or that somehow this is automatically considered when assessing the risk to information.
The thing is, there are differing types of information and differing types of containers for them.
I get criticised occasionally for long and detailed posts that some readers complain treat them like beginners, but sadly, if I don't write that way I get comments such as this in reply:
Data Loss is something you prevent; you enforce controls to prevent data
leakage, DLP can be a programme, but , I find very difficult to support
with a policy.
Does one have visions of chasing escaping data over the net with a three-ring binder labelled "Policy"?
Let me try again.
Policy comes first.
Without policy giving direction, purpose and justification, and supplying the basis for measurement, quality and applicability (never mind issues such as configuration), you are working on an ad-hoc basis.
On the ISO2700 forum, one user gave a long description of his information gathering process but expressed frustration over what to do with it all: the assets, the threats and so forth, and how to turn it into a risk assessment.
It was easy for the more experienced of us to see what he was missing.
He was missing something very important -- a RISK MODEL.
The model determines what you look for and how it is relevant.