
Doing it Cheap, and Right, with Kiwi Syslog Server, SQL and Netscaler Application Firewall

Last week I noted an interesting blog post from the guys at Splunk, who have developed a way to parse Application Firewall blocks and display them in a nice dashboard. Splunk has been doing some interesting work here in the last 12 months or so that Citrix administrators should take note of, especially if they are feeling the pain of real-time monitoring in their Citrix environment. First off, they hired/contracted Brandon Shell and Jason Conger to work with them. I can tell you that over the years I have had my share of monitoring “tools” shoved down my throat, and the majority of them were NETWORKING tools built by NETWORKING companies to support NETWORKING professionals, who then tried to retrofit the product to monitor servers.

The Citrix environment has its own quirks when it comes to monitoring, and having Brandon and Jason on the Splunk team will pretty much ensure that they build the most relevant monitoring tool out there for supporting Citrix enterprises. While this is not meant to be a glowing endorsement of Splunk, it is an endorsement of the two professionals they have hired to build out their Citrix vision.

This article covers how I am doing SOME of what Splunk is doing at a fraction (almost free) of the cost you would spend on monitoring products, including Splunk. In the last few years I have posted about collecting and logging Netscaler syslogs to SQL Server for the purpose of dashboarding VPN utilization and Endpoint Analysis scan results, as well as logging PIX logs to SQL Server via KIWI. In this post, I will show you some of what I have been doing for the last few years with my App Firewall logs by putting them into KIWI and then writing them to a SQL Server.

Setting up KIWI:

  1. Set up a Filter for KIWI to catch the APP Firewall Logs:


2. Use this parsing script:

Function Main()
    Main = "OK"

    Dim MyMsg
    Dim Offense
    Dim Action
    Dim Source

    With Fields

        Offense = ""
        Action = ""
        Source = ""

        MyMsg = .VarCleanMessageText

        ' Pull the violation name that follows the APPFW tag
        If ( Instr( MyMsg, "APPFW" ) ) Then
            OffenseBeg = Instr( MyMsg, "APPFW" ) + 6
            OffenseEnd = Instr( OffenseBeg, MyMsg, " " )
            Offense = Mid( MyMsg, OffenseBeg, OffenseEnd - OffenseBeg )
        End If

        ' Translate the action tags into a simple status value
        If ( Instr( MyMsg, "<blocked>" ) ) Then
            Action = "BLOCKED"
        End If

        If ( Instr( MyMsg, "<not blocked>" ) ) Then
            Action = "NOT BLOCKED"
        End If

        If ( Instr( MyMsg, "<transformed>" ) ) Then
            Action = "TRANSFORMED"
        End If

        ' Grab the source address that follows the first ": " in the message
        If ( Instr( MyMsg, "." ) ) Then
            SourceBeg = Instr( MyMsg, ": " ) + 3
            SourceEnd = Instr( SourceBeg, MyMsg, " " )
            Source = Mid( MyMsg, SourceBeg, SourceEnd - SourceBeg )
        End If

        ' Map the parsed values to KIWI custom fields
        .VarCustom01 = Offense
        .VarCustom02 = Action
        .VarCustom03 = Source

    End With

End Function


Set up Custom Data Connectors:

Configure database connection and create the table:

Once you have created the table, you should start to see data come in as the App Firewall blocks IPs. I used the free version of Netsparker to generate some blocks, then ran the following query and got the results below:
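If you are building the table by hand, it and the query look roughly like this (a sketch: it assumes the three custom fields map to columns named offense, action and sourceip, plus KIWI's standard date and message-text fields; your names may differ):

create table appfw_logs (
msgdatetime datetime,
msghostname varchar(255),
offense varchar(255),
[action] varchar(32),
sourceip varchar(15),
msgtext varchar(4000)
)

-- most recent App Firewall hits first
select top 100 msgdatetime, offense, [action], sourceip
from appfw_logs
order by msgdatetime desc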

While it is not totally visible in the results, the “MsgText” column includes the entire log. Keeping it may be necessary for forensic purposes, as some jurisdictions require the entire, unparsed log as evidence.

So John, why SQL and not just Splunk?
I have heard folks say that Splunk is expensive, and it might be, but in the realm of monitoring tools I believe it is likely less expensive than most others. For me, I needed the data to be portable so that I could cross-reference it with different tables. In my case, I usually reference my sources against a geospatial table as well as a malware blacklist. If you are in the DMZ currently, it is not a bad idea to collect INTEL on who is performing recon scans or probes against your systems. Having the data in a SQL Server allows me to set up stored procedures that alert me when specific metrics are met. Also, a preponderance of malfeasance can be escalated to your INFOSEC team so that you can be much more proactive in blocking hosts. Below is a query I run that references the GEOIP table that I have. I changed my IP address to one from China to show how you can cross-reference the data.
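A sketch of that cross-reference looks like this. It assumes a hypothetical geoip table that stores each country's address ranges as integer boundaries (ip_from, ip_to) alongside the appfw_logs layout above, and uses PARSENAME to turn the dotted-quad source into a number for the range lookup:

-- count blocks by country of origin (geoip table and columns are illustrative)
select g.country, count(*) as blocks
from appfw_logs a
join geoip g
on ( cast(parsename(a.sourceip, 4) as bigint) * 16777216
   + cast(parsename(a.sourceip, 3) as bigint) * 65536
   + cast(parsename(a.sourceip, 2) as bigint) * 256
   + cast(parsename(a.sourceip, 1) as bigint) ) between g.ip_from and g.ip_to
where a.[action] = 'BLOCKED'
group by g.country
order by count(*) desc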

You can see where a large number of blocks have come from China (well, not really), and this is something you may want to escalate. Usually, hackers are not dumb enough to try something this noisy; my experience is that you will need to look for things like a consistent delta between probes. At any rate, without portability this would be tough to do with a flat-file database, although I do believe Splunk has an API that could ease some of this.
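To make “a consistent delta between probes” concrete, here is one way to compute the gap between consecutive hits from the same source (a sketch against the same hypothetical appfw_logs layout; ROW_NUMBER requires SQL Server 2005 or later):

-- number each source's hits in time order, then diff neighbors
with probes as (
select sourceip, msgdatetime,
row_number() over (partition by sourceip order by msgdatetime) as rn
from appfw_logs
)
select a.sourceip, a.msgdatetime,
datediff(second, a.msgdatetime, b.msgdatetime) as delta_seconds
from probes a
join probes b on b.sourceip = a.sourceip and b.rn = a.rn + 1
order by a.sourceip, a.msgdatetime

Evenly spaced deltas from a single source usually point to a scripted scanner rather than a human.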

Conclusion:
Data portability, for me, is the plum here. This goes beyond making a pretty graph and moves into the long-term battle against the OWASP Top Ten, in that you can gather INTEL and position yourself to know which IPs are risky and which IPs just have some bot or malware on them. Ultimately, if you are not a “SQL-phile” or a programming hack like me, this may not be the best option. Folks like Sam Jacobs are a great resource, as he is another guy who is very adept at getting syslogs into SQL. I expect that with the additions of Conger and Shell you will see great things come out of Splunk for monitoring Citrix environments; there are a number of new and cool things they are doing that I need to “get hip” to. If you are struggling with your budget and have some basic SQL skills, this may be a great way to get metrics and reporting similar to what you can get with Splunk for a fraction of the price.

I apologize for the delay in posts, I plan on getting at least one post a month up for both xen-trifuge and Edgesight under the Hood.

Take care

John


Xen and the art of Digital Epidemiology

In 2003 I started steering my career toward Citrix/VMWare/virtualization, and at the time, aside from being laughed at for running this fledgling product called ESX Server 1.51, most of my environment was Windows-based. There were plenty of shrink-wrapped tools to let me consolidate my events, and the only Unix I had to worry about was the Linux kernel on the ESX Server. Since then my environment has taken on a series of new regulatory frameworks (Sarbanes-Oxley, CISP, and currently FIPS 140-2). What used to be a Secure Gateway with a single Web Interface server and my back-end XenAPP farm now includes a Gartner-leading VPN appliance (Access Gateway Enterprise Edition), load-balanced (GSLB) Web Interface servers, an application firewall, and XenApp servers hosted on Linux-based XenServer and VMWare. So now, when I hear “A user called and said their XenAPP session was laggy,” where the hell do I begin? How do I get a holistic vision of all of the security, performance and stability issues that could come up in this new environment?

As a security engineer in 2004, I started calling event correlation “digital epidemiology.” Epidemiology is defined as “the branch of medicine dealing with the incidence and prevalence of disease in large populations and with detection of the source and cause of epidemics of infectious disease.”

I think this same principle can be applied to system errors, computer-based viruses and overall trends. At the root of this is the ability to collate logs from heterogeneous sources into one centralized database. During this series, I hope to go over how to do this without going to your boss and asking for half a million dollars for an event-correlation package.

I currently perform the following with a $245 copy of KIWI Syslog Server (integrated with SQL Server Reporting Services):

  • Log all Application Firewall alerts to a SQL Server and present them via an operations dashboard. This includes the violation (SQL injection, XSS, etc.), the offending IP, and the time of day.
  • Pull STA logs and provide a dashboard matrix with the number of users, total number of helpdesk calls, percentage of calls (over 2.5% means we have a problem), and the last ten calls. (Our operations staff can see that “PROTOCOL DRIVER ERROR” and react before we start getting calls.)
  • I am alerted when key VIP personnel are having trouble with their SecurID or AD credentials.
  • I can track the prevalence of any error; I can tell when it started and how often it occurs.
  • My service desk has a tracker application they can consult when a user cannot connect, telling them whether the account is locked out, the key fob is expired, or the user just fat-fingered their password. This has turned a 20-minute call into a 3-minute call.
  • I have a dashboard that shows the “QFARM /Load” data for every server, refreshing every 5 minutes; it turns yellow at 7500 and red at 8500, letting us know when a server may be about to waffle.

For this part of the Digital Epidemiology series, I will go over parsing and logging STA logs, why it was important to me, and what you can do with them after getting them into a SQL Server.

Abstract:

A few years ago, I was asked, “What is the current number of external vs. internal users?” Answering that involved a very long, complicated query against the RMSummaryDatabase that worked okay but was time-consuming. One thing we did realize was that every user who accessed our platform externally came through our CAG/AGEE, which meant they were issued a ticket by the STA servers. So we configured logging on the STA servers and realized a few more things: we also got the application that they launched, as well as the IP address of the server they logged into. So now, if a user says they had a bad Citrix experience, we know where they logged in and what applications they used. While Edgesight does most of our user-experience troubleshooting for us, it does not upload in real time; our STA solution does, so we know right then and there.

By integrating this with SQL Server Reporting Services, we have a poor man’s Thomas Koetzing solution where we can search the utilization of certain applications, users and servers.

In this post we will learn how to set up STA logging, how to use Epilog from Intersect Alliance to write the data to a KIWI Syslog Server, and then how to parse that data and write it to a SQL Server, using some of the queries I have included to gain valuable data that can eventually be used in a SQL Server Reporting Services report.

Setting up STA Logging:

Go to the Citrix System32 folder (e.g. %ProgramFiles%\Citrix\System32) and add the following to the ctxsta.config file:

LogLevel=3
MaxLogCount=10
MaxLogSize=55
LogDir=W:\Program Files\Citrix\logs\

(Make sure MaxLogSize is sufficient for your log volume.)

In the LogDir folder you will note that the log files are named staYYYYMMDD.log (for example, sta20091122.log).

What exactly is in the logs:
The logs will show up in the following format. (We are interested in the user name, application name and server address, which the parse script below will pull into the database for us.)

INFORMATION 2009/11/22:22:29:32 CSG1305 Request Ticket - Successful. ED0C6898ECA0064389FDD6ABE49A03B9 V4 CGPAddress = 192.168.1.47:2598:localhost:1494 Refreshable = false XData = <?xml version="1.0"?><!--DOCTYPE CtxConnInfoProtocol SYSTEM "CtxConnInfo.dtd"--><CtxConnInfo version="1.0"><ServerAddress>192.168.1.47:1494</ServerAddress><UserName>JSMITH</UserName><UserDomain>cdc</UserDomain><ApplicationName>Outlook 2007</ApplicationName><Protocol>ICA</Protocol></CtxConnInfo> ICAAddress = 192.168.1.47:1494

Okay, so I have logs in a flat file….big deal!

The next step involves integrating them with a free open-source product called “Epilog” by this totally kick-ass company called Intersect Alliance (www.intersectalliance.com). We will configure Epilog to send these flat files to a KIWI syslog server.

So we will go to the Intersect Alliance Download site to get epilog and run through the installation process. Once that is completed you will want to configure your epilog agent to “tail-and-send” your STA Log Files. We will do this by telling it where to get the log file and who to send it to.

After the installation go to START->Programs->Intersect Alliance-> Snare/Epilog for Windows

Under “LOG CONFIGURATION”, for STA logs we will use the log type “Generic”, type in the location of the log files, and tell Epilog to use the file format STA20%-*.log.

After configuring the location and type of logs, go to “Network Configuration” and type in the IP address of your syslog server, selecting port 514 (syslog uses UDP 514).

Once done, go to “Latest Events” and see if you see your syslog data there.


Section III: KIWI SYSLOG SERVER

I assume that most Citrix engineers have access to a SQL Server, and since Epilog is free, the only thing in this solution that costs money is KIWI Syslog Server. A whopping $245, in fact. Over the years a number of event-correlation solutions have come along; in fact, I was at one company where we spent over $600K on a solution that had a nice dashboard and logged files to a flat-file database (WTF? Are you kidding me?!). The KIWI Syslog Server will allow you to set up ten custom database connectors, and that should be plenty for any Citrix administrator who is integrating XenServer, XenAPP/Windows servers, Netscaler/AGEE, CAG 2000 and Application Firewall logs into one centralized database. While you need some intermediate SQL skills, you do not need to be a superstar, and the benefits of digital epidemiology are enormous. My hope is to continue blog posts on how I use this solution, and hopefully you will see benefits beyond looking at your STA logs.

The first thing we need to do is add a rule called “STA-Logs” and filter for strings that will let KIWI know the syslog update is an STA log. We do so by adding two filters. The first one matches “GenericLog”.

The second filter is “<Username>”. Together, these two filters will match STA syslog messages.


Now that we have created our filters, it's time to perform actions. There are two actions we want to perform: parse the message (pulling out the user name, application and server address from the log text above) and write that data to a table in a database. You add actions by right-clicking “Action” and selecting “Add Action”.

So our first “Action” is to set up a “Run Script” action. I have named mine “Parse Script”.

Here is the script I use to parse the data (Thank you Mark Schill (http://www.cmschill.net/) for showing me how to do this.)

The Script: (This will scrub the raw data into the parts you want; click “Edit Script” and paste.)

##############################
Function Main()
    Main = "OK"

    Dim MyMsg
    Dim Status
    Dim UserName
    Dim Application
    Dim ServerIP

    With Fields

        Status = ""
        UserName = ""
        Application = ""
        ServerIP = ""

        MyMsg = .VarCleanMessageText

        ' Only successful ticket requests contain the CtxConnInfo XML payload
        If ( Instr( MyMsg, "CtxConnInfo.dtd" ) ) Then
            Status = "Successful"

            ' Pull the user name from between <UserName> and the next "<"
            UserBeg = Instr( MyMsg, "<UserName>" ) + 10
            UserEnd = Instr( UserBeg, MyMsg, "<" )
            UserName = Mid( MyMsg, UserBeg, UserEnd - UserBeg )

            ' Pull the published application name
            AppBeg = Instr( MyMsg, "<ApplicationName>" ) + 17
            AppEnd = Instr( AppBeg, MyMsg, "<" )
            Application = Mid( MyMsg, AppBeg, AppEnd - AppBeg )

            ' Pull the XenApp server address
            SrvBeg = Instr( MyMsg, "<ServerAddress>" ) + 15
            SrvEnd = Instr( SrvBeg, MyMsg, "</" )
            ServerIP = Mid( MyMsg, SrvBeg, SrvEnd - SrvBeg )
        End If

        ' Map the parsed values to KIWI custom fields
        .VarCustom01 = Status
        .VarCustom02 = UserName
        .VarCustom03 = Application
        .VarCustom04 = ServerIP

    End With

End Function
##############################

Now that we can parse the data, we need to create a table in a database with the appropriate columns.

The next step is to create the field format and create the table. Make sure the account in the connect string has DBO privileges on the database. Set up the custom field format with the following fields, ensuring that the type is SQL Database.
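Assuming the four custom fields map to columns named status, username, application and serverip (plus KIWI's standard date and message-text fields), the table you end up with looks roughly like this sketch; your names may differ, but the queries later in this post assume these:

create table sta_logs (
msgdatetime datetime,
msghostname varchar(255),
status varchar(32),
username varchar(128),
application varchar(255),
serverip varchar(32),
msgtext varchar(4000)
)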


As you see below, you will need to set up an ODBC connection for your syslog database and provide a connect string here (yes, in clear text, so make sure you know who can log onto the syslog server). When you are all set, click “Create Table” and then “Apply”.
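For illustration, a DSN-less ODBC connect string takes this general form (the server, database and account here are hypothetical):

Driver={SQL Server};Server=SYSLOGSQL01;Database=Syslog;Uid=kiwi_writer;Pwd=YourPasswordHere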


Hopefully, once this is done, you will start filling up your table with STA log entries containing the data from the parse script.

I have included some queries that have been very useful to me. You may also want to integrate this data with SQL Server Reporting Services; with that, you can build a poor man's Thomas Koetzing tool.

Helpful SQL Queries: (Edit @BEG and @END values)

 

How many users for each day (unique users per day):

declare @BEG datetime
declare @END datetime
set @BEG = '2009-11-01'
set @END = '2009-11-30'
select convert(varchar(10), msgdatetime, 111), count(distinct username)
from sta_logs
where msgdatetime between @BEG and @END
group by convert(varchar(10), msgdatetime, 111)
order by convert(varchar(10), msgdatetime, 111)

Top 100 Applications for this month:

declare @BEG datetime
declare @END datetime
set @BEG = '2009-11-01'
set @END = '2009-11-30'
select top 100 application, count(application)
from sta_logs
where msgdatetime between @BEG and @END
group by application
order by count(application) desc

Usage by the hour (unique users for each hour):

declare @BEG datetime
declare @END datetime
set @BEG = '2009-11-01'
set @END = '2009-11-02'
select convert(varchar(2), msgdatetime, 108) + ':00', count(distinct username)
from sta_logs
where msgdatetime between @BEG and @END
group by convert(varchar(2), msgdatetime, 108) + ':00'
order by convert(varchar(2), msgdatetime, 108) + ':00'

Electronic Stimulus

According to the Baltimore Sun, President Obama has promised to spend $50 billion over the next five years to coax hospitals, medical centers and the like to begin offering electronic data. So nurses, occupational therapists and other allied health personnel, as well as doctors, may be carrying something like a Kindle around instead of a clipboard. With this comes an extension of their existing regulatory framework, such as HIPAA and CISP (as no one gets away from a visit to the doctor without putting the plastic down these days), and future restrictions that will be put in place as a result of pressure from Libertarians and ACLU members.

Ensuring that none of my personally identifiable information is left on someone's screen when they walk away from their PC is a very big concern. As these systems are brought online, the data must be protected not just from hackers but also from basic behavioral mistakes that could result in someone leaning over a counter and getting my date of birth, Social Security number and credit card number.

While my security experience with HIPAA is very limited, I can say that keeping this information hidden from the wrong eyes is a basic function of any security endeavor. How vendors, systems integrators and IT personnel bridge this gap could have a direct correlation with how successful they are in this space. How much of that $50 billion over five years will go to IBM? EDS/HP? Perot Systems? What have you done to show these systems integrators, as well as smaller partners, how your product will help them meet this challenge, and how will you deal with a security screw that seems to only get tightened? The fact is, there are millions and millions of medical documents, and finding out which parts of which documents contain sensitive data is virtually impossible. One solution is to pattern-match the data and block it so that it is not visible to the wrong people. You could do this with a DBA who runs ad hoc queries to match the data and replace it with an “X”, but then someone in billing may need that data (keep two copies?), not to mention the staggering cost (Y2K, part 2?). The best way I can think of is to place the data behind a device that can capture the patterns in the header and “X” the data out in real time. Enter Netscaler Platinum, which will not only add compression, authentication, caching and business continuity, but will keep the wrong people from seeing the wrong data. I am not sure when the money will start flowing, but as I understand it, some hospitals have as much as $1.5 million dangled in front of them to meet this challenge.

In this lab, I present how I used the Netscaler Platinum Application Firewall feature to secure personally identifiable data with a rule called “Safe Object”, as well as how to deal with a zero-day worm/virus using the “Deny URL” rule. The “Safe Object” feature, when coupled with the Netscaler policy engine, gives you the flexibility to ensure that certain job types (nurses, doctors, etc.), based on either login (setting authentication on the VIP) or subnet, do not see things like Social Security numbers, credit card numbers and other sensitive data, while at the same time ensuring that the information is available to billing and accounts-receivable personnel.
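To make the pattern-matching idea concrete: the same SSN shape a Safe Object rule protects can also be written as a T-SQL LIKE pattern and used to sweep logs you have already captured for sensitive strings. This is a sketch against the hypothetical appfw_logs layout from the first post above, not the Netscaler rule syntax itself:

-- flag log entries containing an SSN-shaped string (999-99-9999)
select msgdatetime, sourceip, msgtext
from appfw_logs
where msgtext like '%[0-9][0-9][0-9]-[0-9][0-9]-[0-9][0-9][0-9][0-9]%'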

Materials:

For this lab, I used a basic Dell 1950 G6 with a virtualized Netscaler VPX that functioned as a VPN, allowing me to establish a secure tunnel to the sensitive data on a non-wired network residing on that server. An Apache server on the non-wired network, loaded with bogus phone numbers and Social Security numbers, was used as the back-end web server. Again, in a real-world scenario, you could either hypervise your web server and place it on a non-wired network, as covered in my “VPX Beyond the Lab” blog, or ACL off your web server so that only the MIP/SNIP of the Netscaler is allowed to access your web content.

See the lab here:
http://citrix.utipu.com/app/tip/id/11733/