
Web Security Fundamentals by Varonis

Some days back, I received an invitation to attend an online course by Varonis on Web Security Fundamentals, conducted by Troy Hunt. I should say that although this course is aimed at beginners, it is worth watching and pretty interesting. Troy Hunt is a security developer and author of PluralSight tutorials. You can join the course at Web Security Fundamentals on the Varonis website.

Some of the world's biggest data breach examples were enumerated at the beginning of the mini course, followed by some statistics and the impact of web security. The following points were covered:

  1. SQL Injections
  2. Insufficient Transport Layer Security
  3. Insecure Password Storage
  4. Cross-Site Scripting (XSS)
  5. Weak Account Management

The course covers the impact of each risk, how it works (with examples and demos), as well as defense techniques that can be used to strengthen the system.

SQL Injection

[Course slide screenshot. Photo credits: varonis.com]

An example was given using a tool called Havij to automate SQL injection. The tool has a pretty straightforward GUI: you enter the URL, then drill into the tables, columns, etc. to retrieve information from the database. This results in information being leaked from a website if it is not secured. Several ways to defend against SQL injection attacks are:

  • Validate untrusted data – Has the user provided valid input to the system?
  • Parameterize queries – Separate the query and the data (a minimal sketch follows this list)
  • Lock down the database permissions – Apply the ‘principle of least privilege’
  • Apply ‘defense in depth’ – Web application firewall and cryptographic storage
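
As a minimal sketch (not part of the course material), this is what a parameterized query looks like in Python with the built-in sqlite3 module; the table and values are made up purely for illustration:

import sqlite3

# The query text and the untrusted data travel separately, so the input
# cannot change the structure of the SQL statement.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO users (name) VALUES (?)", ("alice",))

untrusted = "alice' OR '1'='1"  # a typical injection attempt

# The ? placeholder treats the attempt as a literal string, not as SQL.
rows = conn.execute("SELECT id, name FROM users WHERE name = ?", (untrusted,)).fetchall()
print(rows)  # [] -- the payload matches nothing

Had the query been built by string concatenation instead, the same payload would have returned every row in the table.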

Transport Layer Security

[Course slide screenshot. Photo credits: varonis.com]

This part elaborated on the lack of encryption at the transport layer, such as missing HTTPS, and especially on how the risk manifests itself. An example of a key logger was used to retrieve information from a web page. Defense against this type of attack was focused on the following points:

  • Apply TLS – Literally apply TLS to encrypt by default
  • Strengthen TLS – Ensure it is a strong implementation of TLS
  • Lock down the application – Use constructs that disallow communication over insecure connections (see the sketch after this list)
  • Apply the same controls internally – Attacks on the transport layer can occur behind the firewall too
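
As a minimal sketch (not part of the course material), the hypothetical WSGI app below illustrates two such constructs: an HSTS header that tells browsers to refuse plain HTTP for this site, and a cookie flagged as Secure and HttpOnly. The max-age of one year is an illustrative value.

from wsgiref.simple_server import make_server

def app(environ, start_response):
    headers = [
        ("Content-Type", "text/plain"),
        # Browsers remember to use HTTPS only, including for subdomains.
        ("Strict-Transport-Security", "max-age=31536000; includeSubDomains"),
        # The session cookie is only sent over encrypted connections and is
        # invisible to client-side script.
        ("Set-Cookie", "session=abc123; Secure; HttpOnly"),
    ]
    start_response("200 OK", headers)
    return [b"Hello over TLS\n"]

if __name__ == "__main__":
    # In production this would sit behind a TLS-terminating web server;
    # serving it plain here is only to show the headers being set.
    make_server("localhost", 8000, app).serve_forever()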

Insecure Password Storage

[Course slide screenshot. Photo credits: varonis.com]

Encryption and decryption mechanisms need to be understood at this level, as they are the basic concepts behind preventing attacks on insecure password storage. An example of a brute force attack was demonstrated, with Hashcat used as the proof-of-concept tool. To prevent such types of attacks:

  • Always hash and never encrypt – This works on the assumption that the entire system may be compromised (a hashing sketch follows this list).
  • Choose the right algorithm – Get the balance between workload and performance right.
  • Enforce password rules – Stronger passwords are significantly harder to crack.
  • Encourage strong passwords – Do not place arbitrary limits on password strength.
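
As a minimal sketch (not part of the course material), here is salted password hashing with a deliberately slow key-derivation function from Python's standard library; the iteration count is an illustrative figure, not a recommendation from the tutorial:

import hashlib, hmac, os

def hash_password(password, iterations=200_000):
    salt = os.urandom(16)  # a unique salt per password defeats precomputed tables
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, iterations, digest

def verify_password(password, salt, iterations, digest):
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return hmac.compare_digest(candidate, digest)  # constant-time comparison

salt, rounds, stored = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, rounds, stored))  # True
print(verify_password("letmein", salt, rounds, stored))                       # False

Purpose-built password hashes such as bcrypt or scrypt make the same workload-versus-performance trade-off, which is the 'choose the right algorithm' point above.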

Cross-Site Scripting (XSS)

A demo was shown on this aspect using a “search” example on a website's search engine. The aim was to show how a search mechanism that reflects user input can be exploited.

[Course slide screenshot. Photo credits: varonis.com]

Defense against such types of attacks was based on the following points:

  • Validate untrusted data – Has the user provided valid input to the system?
  • Always encode output – Ensure that any reflected input cannot be rendered as markup in the browser (an encoding sketch follows this list)
  • Encode for the correct context – HTML / HTML attribute / CSS / JavaScript are all different
  • Protect cookies – Flag cookies as ‘HttpOnly’ so they cannot be accessed by client script.
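
As a minimal sketch (not part of the course material) of output encoding for an HTML context, Python's html.escape turns the markup characters of a reflected search term into harmless entities:

import html

search_term = '<script>alert("xss")</script>'  # untrusted user input

# The encoded output is rendered by the browser as text, not executed as script.
safe = html.escape(search_term, quote=True)
print(f"<p>You searched for: {safe}</p>")
# <p>You searched for: &lt;script&gt;alert(&quot;xss&quot;)&lt;/script&gt;</p>

Encoding for an HTML attribute, CSS or JavaScript context needs different rules, which is exactly the 'encode for the correct context' point.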

Weak Account Management

[Course slide screenshot. Photo credits: varonis.com]

The following factors contribute to weak account management and need to be taken into consideration:

  • Poor password rules
  • Lack of brute force protection
  • Insecure ‘remember me’ feature
  • Vulnerable password change feature
  • Enumerable password resets

Here are some tips against account enumeration attacks:

  • Always respond identically – Return the same message to anonymous users whether or not the account exists (a minimal sketch follows this list)
  • Use email for verification – Email the address and confirm or deny account existence there
  • Consider other enumeration vectors – Login and registration are other common channels for disclosure.
  • Consider the risk in context – Different applications have different levels of privacy expectation.
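
As a minimal sketch (not part of the course material), the hypothetical reset handler below returns the same message whether or not the address is known, so the response gives an attacker nothing to enumerate:

def send_reset_email(email):
    # Placeholder: a real application would email a time-limited reset link.
    print(f"(reset link emailed to {email})")

def request_password_reset(email, known_accounts):
    if email in known_accounts:
        send_reset_email(email)
    # Identical response in both branches.
    return "If an account exists for this address, a reset link has been sent."

accounts = {"alice@example.com"}
print(request_password_reset("alice@example.com", accounts))
print(request_password_reset("mallory@example.com", accounts))  # same message

Response timing can still leak account existence, so real implementations often queue the email sending rather than doing it inline.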

To sum up, it is important to grasp the fact that good security is ‘defense in depth’. Security needs to be considered in the context of cost as well as usability, as many of these attacks provide vectors into the internal network. Security goes well beyond the web application itself. The tutorial ensures that questions are asked at all levels, such as:

  • Is access to data logged and auditable?
  • Do you have visibility of the resources accessible via access controls?
  • How many of these permissions are excessive?
  • Is anyone actually reviewing entitlements?
  • How are you prioritizing security efforts?

 

MySQL Binlog Analysis for Data Loss

Some days back, I encountered a server where it happened that some data had been altered in the database. A quick report can be generated with the mysqlbinlog command.

Photo Credits: Mysql.com

The MySQL binary logs contain “events” that describe database changes, such as table creation operations or changes to table data. They also contain events for statements that potentially could have made changes (for example, a DELETE which matched no rows), unless row-based logging is used. The binary log also records how long each statement that updated data took. The binary log has two important purposes:

  • For replication, the binary log on a master replication server provides a record of the data changes to be sent to slave servers. The master server sends the events contained in its binary log to its slaves, which execute those events to make the same data changes that were made on the master. See the MySQL manual section “Replication Implementation”.
  • Certain data recovery operations require use of the binary log. After a backup has been restored, the events in the binary log that were recorded after the backup was made are re-executed. These events bring databases up to date from the point of the backup. See the MySQL manual section “Point-in-Time (Incremental) Recovery Using the Binary Log”.

It is to be noted that enabling MySQL binary logs makes the server respond a little more slowly, though the benefits are really useful. MySQL binary logs should not be deleted outright with an rm -f command, but rather with the command PURGE BINARY LOGS TO 'mysql-bin.111';

In this article I will demonstrate some commands to strip interesting information out of a binary log file. The statements that can alter information in a database are UPDATE, DELETE, INSERT, REPLACE and ALTER.

1. If you want to read the whole content of a binlog, use the following command. This will output every event recorded in the log.

mysqlbinlog binlog.1111

2. Let’s say you have a list of binlogs and you want to find all the ALTER statements carried out only on a specific database called “question”:

mysqlbinlog binlog.* | grep -i -e "^alter" | grep -i -e "question" >> /tmp/alter_question.txt

3. Let’s say you want to find all the ALTER commands carried out on 03/05/2016 in the file generated in part 2:

grep -i -A 3 '#160503' /tmp/alter_question.txt | less

4. If you want to extract all ALTER statements from a bunch of binlogs for a specific database (question):

mysqlbinlog --database=question binlog.* | grep -B 5 -i -e "^alter" >> /tmp/alter_question.txt

5. You might also want to retrieve information for a specific date and time range:

mysqlbinlog --start-datetime="2016-2-02 5:00:00" --stop-datetime="2016-03-03 8:10:00" mysql-bin.000007

However, analysis of the MySQL binary logs is a pretty vast subject; it depends on what information is needed. I also found that mk-query-digest is an interesting tool to extract information and perform analysis. See http://linux.die.net/man/1/mk-query-digest