Configuring SQL Server destinations
You can use SQL Server destinations to store historical results for analysis and integration with third-party systems. You can forward the results from saved questions and other data sources to a database table.
- Microsoft SQL Server 2008 or later, Microsoft SQL Server 2008 Express or later
- Database application server must be accessible from the Tanium Module Server
- User credentials for an account on the Microsoft SQL Server with db_ddladmin and db_datawriter role membership, so that the account can insert data, create tables (optional), and run stored procedures (optional)
Decide whether to use an existing database or create a new database.
- Get the name of the database that you want to use for Connect. When you set up the destination, you can have Connect create a new table for your data or use an existing table.
- The user account that Connect uses for the database must have login permissions to that database and the appropriate permissions to tables and (optionally) stored procedures within that database.
Both SQL Server Authentication and Windows Authentication are supported. Set up the following privileges:
- For each user, create a User Mapping for the database.
- For each database, grant the CREATE TABLE permission.
- For each schema, grant the ALTER and INSERT permissions.
- (Optional) For stored procedures, grant at least the EXECUTE permission.
- (Optional) Create stored procedures.
You can create procedures that run before the first database table row is inserted or after the last row is inserted. Before you create the connection, these procedures and any associated parameters must exist in your database. The user account for Connect also must have permission to run these procedures.
```sql
GO
/****** Object: Table [dbo].[connection_log] Script Date: 4/25/2017 8:02:32 AM ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [dbo].[connection_log](
    [connectionRunId] [bigint] NOT NULL,
    [connectionId] [bigint] NOT NULL,
    [connectionName] [nvarchar](200) NOT NULL,
    [source] [nvarchar](200) NOT NULL,
    [databaseTable] [nvarchar](100) NULL,
    [startTime] [datetime] NOT NULL,
    [endTime] [datetime] NULL,
    [runStatus] [nvarchar](20) NOT NULL,
    [expectedRowCount] [bigint] NOT NULL,
    [sourceRowCount] [bigint] NOT NULL,
    [destinationRowCount] [bigint] NOT NULL,
    PRIMARY KEY CLUSTERED ([connectionRunId] ASC)
    WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF,
          ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO
/****** Object: StoredProcedure [dbo].[sp_onConnectionFinish] Script Date: 4/25/2017 8:02:32 AM ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE PROCEDURE [dbo].[sp_onConnectionFinish]
    @connectionRunId bigint,
    @runStatus nvarchar(20),
    @sourceRowCount bigint,
    @destinationRowCount bigint,
    @finishTime datetime
AS
BEGIN
    UPDATE connection_log
    SET runStatus = @runStatus,
        sourceRowCount = @sourceRowCount,
        destinationRowCount = @destinationRowCount,
        endTime = @finishTime
    WHERE connectionRunId = @connectionRunId
END
GO
/****** Object: StoredProcedure [dbo].[sp_onConnectionStart] Script Date: 4/25/2017 8:02:32 AM ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE PROCEDURE [dbo].[sp_onConnectionStart]
    @connectionName nvarchar(200),
    @connectionId bigint,
    @connectionRunId bigint,
    @source nvarchar(200),
    @databaseTable nvarchar(100),
    @startTime datetime,
    @expectedRowCount bigint
AS
BEGIN
    IF EXISTS (SELECT * FROM connection_log WHERE connectionRunId = @connectionRunId)
    BEGIN
        DELETE FROM connection_log WHERE connectionRunId = @connectionRunId
    END
    INSERT INTO connection_log (connectionRunId, connectionId, connectionName, source,
        databaseTable, startTime, endTime, runStatus, expectedRowCount,
        sourceRowCount, destinationRowCount)
    VALUES (@connectionRunId, @connectionId, @connectionName, @source, @databaseTable,
        @startTime, NULL, 'Running', ISNULL(@expectedRowCount, 0), 0, 0)
END
GO
```
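As a rough sketch of how the privileges listed earlier can be granted, the following T-SQL maps a login to a database user and applies the role memberships and permissions. The login, database, and schema names (connect_svc, TaniumConnect, dbo) are placeholder assumptions; substitute the names used in your environment.

```sql
-- Placeholder names: connect_svc, TaniumConnect, dbo. Adjust for your environment.
USE [TaniumConnect];
GO

-- Map the login to a database user and add the recommended roles.
CREATE USER [connect_svc] FOR LOGIN [connect_svc];
ALTER ROLE [db_datawriter] ADD MEMBER [connect_svc];
ALTER ROLE [db_ddladmin] ADD MEMBER [connect_svc];

-- Database-level permission to create tables.
GRANT CREATE TABLE TO [connect_svc];

-- Schema-level permissions to alter tables and insert rows.
GRANT ALTER, INSERT ON SCHEMA::[dbo] TO [connect_svc];

-- (Optional) Allow the account to run stored procedures in the schema.
GRANT EXECUTE ON SCHEMA::[dbo] TO [connect_svc];
GO
```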
- On the Connect home page, click Create Connection > Create.
- Enter a name and description for your connection.
- Enable the connection to run on a schedule.
Select Enable. You can set up the schedule when you configure the rest of the connection. If the schedule is not enabled, the connection only runs when you manually run it.
- (Optional) Set the logging level.
By default, the log level is set to Information. Set the log level to Trace or Debug if you are debugging the connection. To reduce the amount of logging, set the log level to Warning, Error, or Fatal.
The connection source determines what data you are sending to the destination. This data is usually information from Tanium, such as a saved question, question log, system status, or event. The settings vary depending on which source you choose.
The action history is a record of all actions issued by console operators. To view this record in Tanium, click Actions > Action History. For more information, see Tanium Platform User Guide: Managing Action History.
Tanium Server keeps detailed audit logs for server configuration and settings changes. Normally, viewing these logs requires direct access to the Tanium database; as an alternative, you can set them up as a data source in Tanium Connect.
If Days to Collect is blank, all data is collected on the first connection run. Subsequent connection runs collect data from the last successful run date to the most recent record.
If Days to Collect is set to a value, every connection run is limited to that many days of data. Add a standard New Items filter to prevent duplication.
Tanium solutions, like Tanium™ Detect and Tanium™ Network Quarantine, can forward events to Connect. These events can then be used as a source in a connection and sent to any of the available connection destinations. For more information, see Tanium Threat Response User Guide: Set up Tanium Connect forwarding and Tanium Network Quarantine User Guide: Configuring notifications.
Palo Alto WildFire
Integration between Tanium and WildFire takes a list of confirmed malware from a Palo Alto firewall and requests a full report from the WildFire system. The full malware report is then converted into a standard indicator of compromise (IOC) and passed to the Detect system for multiple endpoint compromise detection. For more information, see Configuring Palo Alto Networks WildFire and Tanium Detect.
The question history log is a history of every question that has been asked. When you are using the question log as a data source in Connect, you can filter the log in several ways to reduce the total volume of data being sent. For more information, see Tanium Platform User Guide: Question History.
A saved question is a Tanium question that you want to ask on a repeated basis. For more information about saved questions, see Tanium Interact User Guide: Saving Questions. You can use the following settings for saved questions:
| Setting | Description |
|---|---|
| Flatten | Enable the Flatten setting to process results as individual records. For example, you might want to be notified when a new MD5 hash appears on a machine. Without Flatten enabled, the entire data set that the saved question retrieves from a machine, such as all MD5 hashes, is considered a single record, and any change to this data set shows up in the destination. With Flatten enabled, Connect processes each new hash on an individual basis (one MD5 hash from one machine) instead of treating all hashes from a machine as a single record. |
| Hide Errors | If the saved question returns an error, the Hide Errors setting prevents the error results from being sent to the destination. |
| Hide No Results | If the saved question returns [No results], the Hide No Results setting prevents this result from being sent to the destination. |
| Recent | To include results from machines that are offline, select Recent, which returns the most recent answer to the saved question for the offline endpoint. |
| Answer Complete Percent | Results are returned when the saved question reaches the configured completion percentage. Any results that arrive after that percentage has been reached are not sent to the destination. If the data returned from the saved question is incomplete in your destination, you can disable this setting by setting it to 0; all data is then returned after the timeout passes. |
| Timeout | Minutes to wait for clients to reply before returning processed results when Answer Complete Percent is set to 0. If the Answer Complete Percent value is not met by the end of the time limit, the connection run is marked as a failure. |
| Batch Size | Number of rows returned for the saved question results at one time. This setting might vary depending on your destination. |
Use the server information in the following location as a connection source: https://<tanium_server>/info.json.
System Status includes the state of all the endpoints, including useful information about each endpoint such as its IP address, position in the network, and the last time it registered with the Tanium Server. For more information about the System Status data, see Tanium Platform User Guide: Monitoring System Status.
Tanium Asset comes with a set of predefined reports to help you prepare for audit and inventory activities. You can also create your own custom reports and views. For each report or view, you can create a connection that specifies a report or view as a data source. Currently supported destinations include Email, File, HTTP, Socket Receiver, Splunk, and SQL Server. For more information, see Tanium Asset User Guide: Asset overview.
Tanium Comply comes with a set of predefined reports and allows you to create custom reports to help support enterprise compliance goals. For Vulnerability reports, you can create a connection that specifies a report as a data source. For more information, see Tanium Comply User Guide: Comply overview.
Tanium Discover contains reports that maintain an inventory of interfaces in your environment. For each report, you can create a connection that specifies a report as a data source. For more information, see Tanium Discover User Guide: Discover overview.
Tanium Reputation is an aggregated repository of reputation data from various sources, including Palo Alto WildFire, ReversingLabs, and VirusTotal. You can choose which type of status to include, such as only malicious or suspicious content. You can choose to include the full report, which includes the detailed information from the reputation source, not just the status of the reputation item. You must have one or more reputation sources configured to get information from this connection source. For more information, see Tanium Reputation User Guide: Reputation overview.
Tanium™ Threat Response
Tanium Threat Response contains audit reports for actions that were performed in Threat Response. For each report, you can create a connection that specifies a report as a data source. For more information, see Tanium Threat Response User Guide: Threat Response overview.
Tanium Trends provides data visualization panels from saved question or module sources. You can create boards that organize one or more panels. For each board, you can create a connection that specifies a board as a data source in HTML format. Valid destinations are AWS S3, Email, or File. For more information, see Tanium Trends User Guide: Trends overview.
Specify information about how to connect to your database application server, test the database application server connection, and configure which data is added to the database tables and rows.
- In the Destination section, choose SQL Server.
- Specify information about your database application server, including server name, user ID, and password.
If you are using Windows Authentication, specify the user name in the following format: domain_name\login_name.
- To test the connection to the database application server, click Connect. If the connection to the database application server is successful, the field for database name is displayed.
- Choose the database. The field for the schema is displayed. Choose the schema. The field for the table is displayed. Enter a name for a new table to create, or choose an existing table.
- (Optional) Configure advanced settings. Expand Advanced Settings.
You can configure timeouts, batch size, and stored procedures.
Stored procedure fields and parameters are available only if your database already has stored procedures configured.
(Optional) You can use filters to modify the data that you are getting from your connection source before it is sent to the destination.
For more information about the types of filters you can configure, see Reference: Filtering options.
(Optional) You can view and change the mapping of the data from your source to the columns in your database table.
If the source is a saved question, the following columns are also available for writing to the SQL Server destination:
- Connection Run ID: The ID of the connection run.
- Band Number: The source row number. When results are flattened, each output row retains the number of the source row that was received from Tanium.
- Row Number: The destination row number. This value is a counter for the rows that are output per connection run. The value is unique for each connection run.
You can change the Destination name of each column and Value Type to force the column to be a String, Numeric, or DateTime value.
If you choose Numeric for the value, you can specify a default value that is used if the data cannot be coerced into a numeric value. You can specify any negative or positive number.
If you choose DateTime for the value, specify the Date/Time format that you want to use for the column. For more information about using a variable, see Timestamp variables.
When you edit a connection with the SQL Server destination, the columns that are in the database are not displayed by default. You can edit the text fields for the columns, but you must provide the User Name and Password and click Connect in the Source and Destination section to view the columns from the database.
Connections can run at a highly configurable time interval, such as multiple times per hour, day, week, or month.
Update the schedule:
- Use the Generate Cron tab to build a schedule based on some common time intervals. This tab generates a Cron expression.
- To view or edit the Cron expression directly, click the Edit Cron Expression tab.
A quick reference to Cron syntax follows. You can use Crontab to build a Cron expression.
```
┌──────────── minute
│ ┌────────── hour
│ │ ┌──────── day of month
│ │ │ ┌────── month
│ │ │ │ ┌──── day of week
│ │ │ │ │
* * * * *
```
Each asterisk is a field that must be included in the Cron expression. The field value can either be an asterisk (any value) or one of the following values:
| Field | Allowed values |
|---|---|
| minute | 0-59 |
| hour | 0-23 |
| day of month | 1-31 |
| month | 1-12 |
| day of week (Sunday is 0 or 7) | 0-7 |
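For example, the following expressions (illustrative schedules, not defaults) show how the five fields combine:

```
0 * * * *     # at the top of every hour
*/15 * * * *  # every 15 minutes
30 2 * * 1    # at 2:30 AM every Monday
0 0 1 * *     # at midnight on the first day of each month
```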
- Click Create Connection > Create. After the connection is created, it appears in the list on the Connections page.
- To view details about when the connection is running, click the name of the connection. On the resulting connection details page, click the Runs tab.
- To view individual run logs, click the link in the Status column in the Runs table.
Your connection is listed on the Connections page and data is sent to the Microsoft SQL Server at the interval you specified. You might want to check your database table to verify that the data is being exported correctly.
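If you created the optional connection_log table and stored procedures from the sample script earlier, a query along these lines (using that script's table and column names) can help confirm that runs are being recorded:

```sql
-- List the ten most recent connection runs and their row counts.
SELECT TOP 10
    connectionName,
    runStatus,
    startTime,
    endTime,
    sourceRowCount,
    destinationRowCount
FROM [dbo].[connection_log]
ORDER BY startTime DESC;
```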
Last updated: 8/23/2019 10:26 AM