Collecting data from endpoints with Live Response

A critical step in the incident response process is the collection of data from compromised endpoints for further forensic analysis. Threat Response provides a feature called Live Response that you can use to collect specific information from endpoints for forensic analysis, data correlation, and investigation of potentially compromised systems, using a customizable and extensible framework.

Live Response collects forensic information from endpoints and transfers the results to a network location that you specify in a package. The Live Response package contains configuration files that identify the data to collect and the destination where the collected files are saved.

Destinations

A destination is a location to save forensic data. The server that receives information from Live Response can be Azure, an Amazon S3 bucket, or a server that communicates over the SFTP, SCP, or SMB protocol. SMB is supported on Windows only; SMB destinations are not included in Live Response packages for macOS and Linux.

For SSH (SFTP/SCP) destinations, a user with write access to the share on the destination is required. Consider modifying the /etc/ssh/sshd_config file on the server to allow only SFTP or SCP access. As a best practice, use a Linux destination for SFTP/SCP transfers.
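The sshd_config change mentioned above can be sketched as follows. This is a minimal illustration, not a complete hardening guide: the user name lr-transfer is a hypothetical dedicated transfer account, and forcing internal-sftp limits that account to SFTP only (allowing SCP as well would require a different approach, such as a restricted shell).

```
# /etc/ssh/sshd_config - restrict the hypothetical lr-transfer account to SFTP only
Match User lr-transfer
    ForceCommand internal-sftp
    AllowTcpForwarding no
    X11Forwarding no
```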

The key exchange algorithms supported by Live Response for SSH destinations include:

  • [email protected]
  • ecdh-sha2-nistp256
  • ecdh-sha2-nistp384
  • ecdh-sha2-nistp521
  • diffie-hellman-group14-sha1
  • diffie-hellman-group1-sha1
  • diffie-hellman-group-exchange-sha256

At least one of these algorithms must be supported by the server for SSH (SFTP/SCP) destinations.
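One quick way to check for overlap is to compare the list above against the algorithms a server offers, for example as copied from verbose ssh client output. The snippet below is an illustrative sketch only; the OFFERED value is a fabricated sample, not output from a real server.

```shell
# Key exchange algorithms supported by Live Response (from the list above)
SUPPORTED="curve25519-sha256@libssh.org ecdh-sha2-nistp256 ecdh-sha2-nistp384 ecdh-sha2-nistp521 diffie-hellman-group14-sha1 diffie-hellman-group1-sha1 diffie-hellman-group-exchange-sha256"

# Hypothetical server offering (e.g., copied from verbose `ssh -vv` output)
OFFERED="curve25519-sha256,ecdh-sha2-nistp256,diffie-hellman-group16-sha512"

# Print every Live Response algorithm that the server also offers
for alg in $SUPPORTED; do
    case ",$OFFERED," in
        *",$alg,"*) echo "usable: $alg" ;;
    esac
done
# prints: usable: ecdh-sha2-nistp256
```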

For an SMB copy location, the system account is used. SMB shares work with domain-joined endpoints. Either the specific endpoint must have write access, or the Domain Computers group must have write access. The following advanced permissions are required:

  • Create files / write data
  • Create folders / append data
  • Write attributes

For Amazon S3 Bucket copy locations, ensure that clients are synchronized with a time server. Transfers fail if the client time differs from the server time by more than 15 minutes.

Do not use SMB transfer destinations when a system has been quarantined by Tanium. Live Response uses domain authentication for transfers. When a system is quarantined it cannot reauthorize with the domain and authentication fails.

  1. From the Threat Response menu, click Management > Live Response. Click Create > Destination.
  2. In the General Information section, provide a name and description for the destination.
  3. Select a destination type. Available destination types are S3, SSH, and SMB. The destination type that you select determines which settings are required. See Destination types for more information.
  4. Click Save.

Be aware that keys you use to authenticate destinations should be considered public. Do not use keys that provide access to sensitive information, and ensure the keys are limited in scope. Do not allow keys to read, overwrite, or delete data.

Destination types

Different types of destinations require different settings.

There is no option for disabling hostkey verification for SSH destinations in Live Response for Threat Response.

Azure Destinations

For Azure destinations, the following settings are required:

  • Storage Account: The name of the storage account that contains Azure Storage data objects. The storage account provides a unique namespace for Azure Storage data that you can access over HTTP or HTTPS.
  • Container: The name of the Azure Storage container where Live Response data is saved.
  • Key File: The storage account access key, which is used to construct a connection string to access Azure Storage. The values in the connection string are used to construct the Authorization header that is passed to Azure Storage.
  • SAS Token: A shared access signature (SAS) token that permits delegated access to resources in your storage account. The SAS token encapsulates all of the information needed to authorize a request to Azure Storage in the URL.
  • Maximum Attempts: The maximum number of times an operation is attempted before producing an error.
  • Simultaneous Uploads: The number of concurrent uploads that can occur.
  • Connection Timeout: The amount of time to attempt to establish a connection.

S3 Destinations

For S3 destinations, the following settings are required:

  • Bucket: The name of the S3 bucket. When using an S3 bucket as a destination, make sure that clients are synchronized with a time server. Transfers fail if the client time differs from the server time by more than 15 minutes.
  • Access Key ID: An ID that corresponds with a secret access key. For example, AKIAIOSFODNN7EXAMPLE.
  • Secret Access Key: A secret key that corresponds with an access key ID. For example, wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY. Manage your access keys as securely as you do your user name and password. When you create access keys, you create the access key ID and secret access key as a set. During access key creation, AWS gives you one opportunity to view and download the secret access key part of the access key. If you do not download the key or if you lose the key, you can delete the access key and then create a new one.
  • Region: An explicitly defined S3 region.
  • Host: The fully qualified domain name of the host.
  • Port: The port to use for the connection to the destination. The default is 443.
  • Use SSL: Enables SSL encryption.
  • Force Path Style Types: Forces API calls to use path-style URLs, where the bucket name is part of the URL path for accessing buckets.
  • Connection Timeout: The amount of time to attempt to establish a connection.
  • Remote Path: A path on the destination where data is collected.

SSH Destinations

For SSH destinations, the following settings are required:

  • Protocol: Select SFTP or SCP as the protocol to transfer collected files to the destination.
  • Authentication Type: Private Key or Password. The authentication type that you select determines whether you are prompted to provide a private key or a password to authenticate with the destination.
  • Host: The fully qualified domain name of the host.
  • Port: The port to use for the SSH connection to the destination. The default is 22.
  • Username: The user name for the connection to the destination.
  • Password or Private Key: The password for the user name, or a private key to authenticate the connection to the destination. An RSA key must be base64 encoded before you enter it into the private key field. You cannot password protect private keys used for Live Response destinations.

In PowerShell you can convert to base64 encoding using the following command:

[Convert]::ToBase64String([System.IO.File]::ReadAllBytes('<filepath>')) | clip

Adding | clip to the end of the command sends the base64 output directly to the clipboard for pasting. If you try to copy and paste from the command line output, it is possible to introduce carriage returns which break the input and produce the error "invalid character" in the Tanium data entry console.


On macOS and Linux you can convert to base64 encoding using the following command:

cat <filepath> | base64 -w 0

The -w 0 option disables line wrapping in GNU base64. The BSD base64 shipped with macOS may not support -w; in that case, pipe the output through tr -d '\n' instead to remove the line wraps.

Although the normal OpenSSH key format is already base64, encode the entire key file again before uploading.
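A portable way to produce a single unwrapped base64 line on either platform is to strip newlines explicitly. The sketch below writes a stand-in key file for illustration; substitute your real key file.

```shell
# Create a stand-in key file for illustration (use your real key file instead)
printf '%s\n' '-----BEGIN KEY-----' 'abc123' '-----END KEY-----' > id_rsa

# Encode as a single base64 line, portably across GNU and BSD base64
base64 < id_rsa | tr -d '\n' > id_rsa.b64

# The encoded file contains no newlines at all
wc -l < id_rsa.b64   # 0: tr removed every newline, including the trailing one
```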


  • Known Hosts: The content of an SSH known_hosts file. See Generate known_hosts and test connections.
  • Connection Timeout: The amount of time to attempt to establish a connection.
  • Remote Path: A path on the destination where data is collected. This path is relative to the home directory of the present user. Absolute paths are not supported.

SMB Destinations

The SMB transfer protocol is only supported on Windows operating systems. SMB destinations are not included in Live Response packages for macOS and Linux.

For SMB destinations, the following settings are required:

  • Universal Naming Convention: The UNC path of the destination. For example, \\server\folder

Generate known_hosts and test connections

The Threat Response - Live Response [Windows] package contains taniumfiletransfer_32.exe and taniumfiletransfer_64.exe. When the package is deployed, the file that is appropriate for the endpoint architecture is copied to the endpoint and renamed to taniumfiletransfer.exe. You can download taniumfiletransfer_32.exe or taniumfiletransfer_64.exe from the Threat Response - Live Response [Windows] package.

To see a list of supported protocols, run one of the following commands depending on the architecture you are using:

taniumfiletransfer_64 protocol

taniumfiletransfer_32 protocol

To see details about a specific protocol, including options for the protocol connection string, run: taniumfiletransfer_64 protocol <protocol>

To generate a known_hosts file for use with SFTP or SCP:

  1. Connect to the host you want to get a capture from.
  2. Get the target IP you want to transfer the file to.
  3. Run the following command: on Windows endpoints, C:\> taniumfiletransfer_64.exe ssh-keyscan <host> > known_hosts; on Linux and macOS endpoints, ssh-keyscan <host> > known_hosts.

To update the known_hosts file in the Live Response package, paste the contents of the file that you generated into the Known Hosts field in the destination. Note the following when editing the generated file:

  • The ssh-keyscan command comments out every SSH fingerprint with a single # character. Remove the # before each fingerprint that should be accepted.
  • A description of each fingerprint is provided on lines prefixed with two ## characters. Leave these lines as comments.
  • If you use a non-standard port, edit the lines starting with ## to reference that port. For example, if you are using port 222, update the line:

## Host: 192.168.0.113:22 (192.168.0.113:22)...

to:

## Host: 192.168.0.113:222 (192.168.0.113:222)...

If the known_hosts file is not edited to reference the correct port, Live Response encounters a failure. The known_hosts file must be ASCII encoded.
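The manual edits described above (uncommenting accepted fingerprints, and fixing the port in the ## description lines) can also be scripted. The sketch below runs against a fabricated two-line scan result; review the output before pasting it into the Known Hosts field.

```shell
# Fabricated ssh-keyscan output: a ## description line plus a commented fingerprint
cat > known_hosts <<'EOF'
## Host: 192.168.0.113:22 (192.168.0.113:22)
# 192.168.0.113 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5EXAMPLEKEY
EOF

# Uncomment single-# fingerprint lines, leave ## description lines as comments,
# and rewrite the default port 22 to a non-standard port 222
sed -e 's/^# \([^#]\)/\1/' -e '/^##/s/:22/:222/g' known_hosts > known_hosts.fixed

cat known_hosts.fixed
# prints:
#   ## Host: 192.168.0.113:222 (192.168.0.113:222)
#   192.168.0.113 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5EXAMPLEKEY
```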

Alternatively you can perform the following steps:

  1. Generate the destination package. See Collect data from endpoints.
  2. Delete the known_hosts file in the Threat Response - Live Response [OS] package.
  3. Add the file you generated with no changes. Note that a Linux package cannot be updated from a Windows endpoint due to the changes Windows makes to the file format.
  4. Save the package.

To test sending a file via an SFTP connection string, enter the following command:

C:\> taniumfiletransfer_64.exe send test_file -d sftp://<user>:<password>@<host>:/<dir>#knownHostsFile=known_hosts

Collections

A collection defines the data to collect from an endpoint. The following configurations are provided with Live Response:

  • Standard Collection: Use for default data. The standard collection contains file collectors to collect specific files from endpoints. See File Collector Sets for a reference of the file collectors that are contained in each type of collection. The following data is captured by default, and is configurable in the standard collection:
    • Enumerate Running Processes
    • Enumerate Process Modules
    • Enumerate System Drivers
    • Analyze Windows Prefetch
    • Analyze Windows Amcache
    • Analyze Windows Shimcache
    • Analyze Windows Filesystem .LNK Files
    • Analyze Windows Scheduled Jobs
    • Analyze Windows Recent Files Cache
    • Analyze Windows User Assist Recent Applications
    • Analyze Network Connection Details
    • Analyze Process and File Handle Details
    • Analyze System Startup

  • Extended Collection: Use to collect the same data as the standard collection, plus more file based artifacts, such as the kernel, the Master File Table, USN Journal, event logs, registry hive files, and so on. The extended collection contains file collectors to collect specific files from endpoints. See File Collector Sets for a reference of the file collectors that are contained in each type of collection. The following data is configurable in the extended collection:
    • Enumerate Running Processes
    • Enumerate Process Modules
    • Enumerate System Drivers
    • Analyze Windows Prefetch
    • Analyze Windows Amcache
    • Analyze Windows Shimcache
    • Analyze Windows Filesystem .LNK Files
    • Analyze Windows Scheduled Jobs
    • Analyze Windows Scheduled Tasks Via Schtasks
    • Analyze Windows Recent Files Cache
    • Analyze Windows User Assist Recent Applications
    • Collect Parsed UsnJrnl Entries
    • Analyze Network Connection Details
    • Analyze Process and File Handle Details
    • Analyze System Startup Applications
    • Collect Recorder Database Snapshot

The option to Collect Recorder Database Snapshot enables you to collect a snapshot of either recorder.db or monitor.db from endpoints. Collect Recorder Database Snapshot creates a snapshot of a recorder database - whether or not it is encrypted - and adds the snapshot to the collection. The snapshot that this module creates is removed from the endpoint when the collection has completed. By default, recorder database snapshots are saved in a folder named RecorderSnapshot on a path that corresponds with the name of the endpoint. For example, <base_directory>\<endpoint_name>\collector\RecorderSnapshot\<database_name>.db.

  • Memory Collection: Use for memory acquisition. The memory collection contains file collectors to collect specific files from endpoints. See File Collector sets for a reference of the file collectors that are contained in each type of collection. Memory data is configurable in the memory collection.

You can create a custom configuration to collect specific data from endpoints.

  1. From the Threat Response menu, click Management > Live Response. Click Create > Collection.
  2. In the General Information section, provide a name and description for the collection.
  3. Select the modules that you want to include in the data collection. A module is a functional area of forensic investigation. For example, the Network Connections module collects data that is helpful to understanding network connections that the endpoint has been involved in. The operating system icons next to each module show the operating systems to which the modules apply.
  4. Select the File Collector sets that you want to include in the collection. See File Collectors for more information.
  5. Under Script Sets, select the script sets that you want to include in the collection. See Script Sets for more information.
  6. Click Save.

File Collector sets

File collector sets define the types of files that you want to collect from endpoints. For example, you can select all files of a specific type, or files that reside on a specific path. Live Response on Windows collects alternate data streams. The name of the alternate data stream is appended to the name of the regular data stream, preceded by an underscore. For example, if an alternate data stream named hidden_datastream exists for a file named hosts, the alternate data stream is collected as <path>\hosts_hidden_datastream.

When setting a maximum recursive depth, enter -1 to represent unlimited.
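The way a file collector's path, file pattern, and maximum recursive depth interact can be approximated with standard tools. This is only an illustration of the matching semantics, not the product's implementation; the directory tree below is fabricated.

```shell
# Build a small fabricated tree to collect from
mkdir -p demo/etc/ssh
printf 'x' > demo/etc/hosts
printf 'x' > demo/etc/ssh/sshd_config

# A collector with path demo/etc, file pattern ^hosts$, and recursive depth 1:
# descend one level below the path and keep files whose name matches the pattern
find demo/etc -maxdepth 1 -type f | grep -E '/hosts$'   # prints demo/etc/hosts
```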

File Collector set | File Collector | Featured in Collection | Operating System | Path | File Pattern | Maximum recursive depth | Maximum files to collect
Hosts FileWindows Hosts FileStandard, ExtendedWindows%systemdrive%\
windows\system32\
drivers\etc
(^hosts$)11
Non-Windows Hosts File

Standard, Extended

Linux, Mac/etc(^hosts$)11
Etc Folder TreeEtc

Standard, Extended

Linux, Mac/etc.*15Unlimited

Shell History FilesPowerShell HistoryStandard, ExtendedWindows%userprofile%\
AppData\Roaming\
Microsoft\Windows\
PowerShell\PSReadline\
^ConsoleHost_history.txt$01
Bourne Again (bash) Shell HistoryStandard, ExtendedLinux, Mac$HOME^\.bash_history$01
Bourne (sh) Shell HistoryStandard, ExtendedLinux, Mac$HOME^\.sh_history$01
Bourne Again (bash) SessionsStandard, ExtendedMac$HOME/.bash_sessions.*history.*15Unlimited
Secure Shell (SSH) FilesUser's Known HostsStandard, ExtendedLinux, Mac$HOME/.ssh^known_hosts$01
User's Authorized KeysStandard, ExtendedLinux, Mac$HOME/.ssh^authorized_keys$01
Current SSH UsersStandard, ExtendedLinux, Mac/var/run^utmp.*01
SSH Logon LogoffStandard, ExtendedLinux/var/log^wtmp.*0Unlimited
Failed SSH LogonStandard, ExtendedLinux/var/log^btmp.*0Unlimited
SSH Last Logged On UsersStandard, ExtendedLinux/var/log^lastlog$0Unlimited
SSH Daemon ConfigurationStandard, ExtendedLinux, Mac/etc/ssh^sshd_config$0Unlimited
SSH Client ConfigurationStandard, ExtendedLinux, Mac/etc/ssh^ssh_config$0Unlimited
Systemd Folder TreeSystemdStandard, ExtendedLinux/etc/systemd/system.*15Unlimited
Kext DetailsKext DetailsStandard, ExtendedMac/var/db/
SystemConfiguration
^KextPolicy$15Unlimited
Kext Details (v11+)Standard, ExtendedMac/var/db/
SystemPolicyConfiguration
^KextPolicy$15Unlimited
Master File TableWindows Master File TableExtendedWindows%systemdrive%(\$MFT$)11
UsnJrnlUsnJrnlExtendedWindows%systemdrive%\$Extend(%.UsnJrnl$)11
KernelWindows KernelExtended, MemoryWindows%systemdrive%\windows\
system32\
ntoskrnl\.exe11
System Registry HivesWindows System Registry HivesExtendedWindows%systemdrive%\windows\
system32\config\
((^system$)|(^security$)|
(^software$)|(^sam$))
14
User Registry HivesWindows User Registry HivesExtendedWindows%userprofile%\(^ntuser\.dat$)2Unlimited
Windows Event LogsWindows Event LogsExtendedWindows%systemdrive%\windows\
system32\winevt\logs\
.*\.evtx12000
Windows Prefetch FilesWindows Prefetch FilesExtendedWindows%systemroot%\prefetch\(.*\.pf)|(layout\.ini)|(.*\.db)|(pfsvperfstats\.bin)1Unlimited
Chrome User DataWindows Chrome User Data - CacheExtendedWindows%LOCALAPPDATA%\Google\
Chrome\User Data\
Default\Cache\
.*0Unlimited
Windows Chrome User Data - Local StorageExtendedWindows%LOCALAPPDATA%\Google\
Chrome\User Data\
Default\Local Storage\
.*0Unlimited
Windows Chrome User Data - ProfileExtendedWindows%LOCALAPPDATA%\Google\
Chrome\User Data\
Default\
.*0Unlimited
MacOS Chrome DataExtendedMac$HOME/Library/Application Support/Google/Chrome.*9Unlimited
Linux Chrome DataExtendedLinux$HOME/.config/google-chrome.*9Unlimited
Tanium Trace DatabaseWindows Tanium Trace DatabaseExtendedWindows%TANIUMDIR%\^monitor\.db(\-)
*(wal|shm|journal)*$
0Unlimited
Tanium Index DatabaseWindows Tanium Index DatabaseExtendedWindows%TANIUMDIR%\Tools\EPI\^EndpointIndex\.db(\-)*(wal|shm|
journal)*$
0Unlimited
Shell Configuration FilesBourne Again (bash) SettingsExtendedLinux, Mac$HOME^\.bash(rc|_profile|
_aliases)$
01
C Shell (csh and tcsh) SettingsExtendedLinux, Mac$HOME^\.(tcshrc|
cshrc)$
01
Available ShellsAvailable ShellsExtendedLinux, Mac/etc^shells$01
Passwd and Group FilesPasswd and Group FilesExtendedLinux, Mac/etc^(passwd|group)$0Unlimited
Shadow FilesShadow FilesExtendedLinux, Mac/etc^(shadow|gshadow|

master\.shadow)$
0Unlimited
Sudoers ConfigurationSudoers FileExtendedLinux, Mac/etc^sudoers$0Unlimited
Sudoers.d Folder ContentsExtendedLinux, Mac/etc/sudoers.d.*15Unlimited
Mount PointsMount PointsExtendedLinux/etc^fstab$01
NFS Mount PointsExtendedLinux/etc^exports.*01
Preload Shared LibrariesLD Preload Shared LibrariesExtendedLinux/etcld\.so.*15Unlimited
LD Preload Shared Libraries Configuration DirectoryExtendedLinux/etc/ld.so.conf.d.*0Unlimited
Auditd Configuration and RulesAuditd Configuration and RulesExtendedLinux/etc/audit.*15Unlimited
RPM GPG KeysRPM GPG KeysExtendedLinux/etc/pki/rpm-gpg.*15Unlimited
SSL/TLS Certificates and PKISSL/TLS Certificates DirectoryExtendedLinux/etc/pki/tls.*15Unlimited
SSL/TLS Certificate Authority DirectoryExtendedLinux/etc/pki/CA.*15Unlimited
User Recently Used/Deleted FilesRecently Used GTK FilesExtendedLinux$HOME/.local/sharerecently-used\
.xbel
15Unlimited
Recently Deleted InfoExtendedLinux$HOME/.local/share/
Trash/info
.*15Unlimited
Recently Deleted FilesExtendedLinux$HOME/.local/share/
Trash/files
.*15Unlimited
User Vim ConfigurationVim InfoExtendedLinux, Mac$HOME^\.viminfo$0Unlimited
Non-Windows Vim ConfigurationExtendedLinux, Mac$HOME^\.vimrc$0Unlimited
Windows Vim ConfigurationExtendedWindows%homepath%\^_vimrc$0Unlimited
User Less HistoryLess HistoryExtendedLinux, Mac$HOME^\.lesshst$0Unlimited
User Database HistoryDatabase HistoryExtendedLinux, Mac$HOME^\.(psql|mysql|
sqlite)_history$
0Unlimited
Cron SettingsCron FilesExtendedLinux, Mac/etc/cron.*15Unlimited
Cron LogsExtendedLinux, Mac/var/logcron.*15Unlimited
Spotlight InformationSpotlight Disk DatabaseExtendedMac/.Spotlight-V100/
Store-V2
.*store\\.db15Unlimited
User Spotlight ShortcutsExtendedMac$HOME/Library/
Preferences/
^com\\.apple\
\.spotlight\\.plist$
15Unlimited
User Spotlight ShortcutsExtendedMac$HOME/Library/
Application Support/
^com\\.apple\
\.spotlight\\.Shortcuts$
15Unlimited
User and Application AnalyticsMacOS User and Analytics InformationExtendedMac/var/db/CoreDuet/Knowledge^knowledgeC\\.db$15Unlimited
MacOS Per-User and Analytics InformationExtendedMac$HOME/Library/Application Support/Knowledge/^knowledgeC\\.db$15Unlimited
Program Execution ReportsProgram Execution ReportsExtendedMac/Library/Logs/
DiagnosticReports
.*\\.core_analytics15Unlimited
Program Execution Aggregate AnalyticsExtendedMac/private/var/db/
analyticsd/aggregates
.*15Unlimited
Domain User InformationDomain User InformationExtendedMac/private/var/db/
ConfigurationProfiles/Store/
ConfigProfiles\
\.binary
15Unlimited
User Mail DataUser Mail DataExtendedMac$HOME/Library/Mail/.*15Unlimited
Microsoft OfficeMacOS Microsoft Office ArtifactsExtendedMac$HOME/Library/Group Containers/^MicrosoftRegistrationDB\
\.reg$
15Unlimited
Deleted UsersMacOS Deleted UsersExtendedMac/Library/Preferences/^com\\.apple\\.preferences\
\.accounts\\.plist$
15Unlimited
NotificationsMacOS Notifications Database FilesExtendedMac/var/folders/^db(-wal|-shm)*$15Unlimited
Event MonitorMacOS Event MonitorExtendedMac/System/Library/LaunchDaemons/^com\\.apple\\.emond\\.plist$15Unlimited
DHCP LeasesMacOS DHCP LeasesExtendedMac/private/var/db/dhcpclient/leases/.*15Unlimited
MacOS Finder InformationMacOS Finder MRUExtendedMac$HOME/Library/Preferences/^com\\.apple\\.finder\
\.plist$
15Unlimited
MacOS Recently Opened ItemsExtendedMac$HOME/Library/Preferences^com\\.apple\\.recentitems\
\.plist$
15Unlimited
MacOS Recently Opened Items 11+ExtendedMac$HOME/Library/Application Support/com.apple.sharedfilelist.*15Unlimited
Network Services, Settings, and LogsMacOS Network Services and SettingsExtendedMac/Library/Preferences/
SystemConfiguration/
^preferences\\.plist$15Unlimited
VPN LogExtendedMac/var/log^ppp\\.out$15Unlimited
Daily Network LogExtendedMac/var/log^daily\\.out$15Unlimited
Network UsageExtendedMac/var/networkd^netusage\\.sqlite$15Unlimited
MacOS Remembered NetworksExtendedMac/Library/Preferences/
SystemConfigurations/
^com\\.apple\\.airport\
\.preferences\\.plist$
15Unlimited
Update and Backup InformationMacOS Last UpdateExtendedMac/Library/Preferences/^com\\.apple\\.SoftwareUpdate\
\.plist$
15Unlimited
MacOS Last Time Machine BackupExtendedMac/Library/Preferences/^com\\.apple\\.TimeMachine\
\.plist$
15Unlimited
Scripting AdditionsMacOS Scripting AdditionsExtendedMac/System/Library/
ScriptingAdditions/
.*15Unlimited
System AdministratorsMacOS System AdministratorsExtendedMac/var/db/dslocal/nodes/
Default/groups/
^admin\\.plist$15Unlimited
Authorization DatabaseMacOS Authorization DatabaseExtendedMac/var/db^auth\\.db$1Unlimited
Safari User DataMacOS Safari User DataExtendedMac$HOME/Library/Safari/^(History\\.db|Downloads\
\.plist|TopSites\\.plist|
RecentlyClosedTabs\
\.plist|Bookmarks\
\.plist|
CloudTabs\\.db)$
15Unlimited
Firefox User DataMacOS Firefox User DataExtendedMac$HOME/Library/Application Support/Firefox/Profiles/^(places\\.sqlite|addons\
\.json|extensions\\.sqlite|
extensions\\.json
|search\\.sqlite)$
15Unlimited
System Resource Utilization Management (SRUM) DatabaseSystem Resource Utilization Management (SRUM) DatabaseExtendedWindows%systemdrive%\windows
\system32\sru
^srudb.dat$01
Edge (Chromium) Internet Explorer Data Collection (Mac)MacOS Edge DataExtendedMac$HOME/Library/Application Support/Microsoft Edge/Default.*9Unlimited
MacOS Edge Data - CacheExtendedMac$HOME/Library/Caches/Microsoft Edge/Default/Cache.*9Unlimited
Edge (Chromium) Internet Explorer Data Collection (Windows)Windows Edge User Data - CacheExtendedWindows%LOCALAPPDATA%\Microsoft

\Edge\User Data\Default\Cache
.*0Unlimited
Windows Edge User Data - Local StorageExtendedWindows%LOCALAPPDATA%\Microsoft

\Edge\User Data\Default
.*0Unlimited
Windows IE v10-11ExtendedWindows%LOCALAPPDATA%\Microsoft

\Windows\WebCache
.*0Unlimited
Windows IE v8-9ExtendedWindows%LOCALAPPDATA%\Microsoft

\WindowsHistory
.*0Unlimited
Microsoft IE v6-7ExtendedWindows%LOCALAPPDATA%\LocalSettings

\History\History
.*0Unlimited
Microsoft IE v8-9 Index.datExtendedWindows%APPDATA%\Microsoft

\Windows\IEDownloadHistory\
index\.dat11
Microsoft IE 8-9 DownloadsExtendedWindows%APPDATA%\Microsoft

\Windows\IEDownloadHistory\
.*1Unlimited

You can create custom file collector sets.

  1. From the Threat Response menu, click Management > Live Response. Click Create > File Collector Set.
  2. In the General Information section, provide a name and description for the file collection set.
  3. Click Create File Collector.
  4. Provide a name for the file collector.
  5. Provide a path for files to collect. Paths support environment variables and regular expressions. For more information, see Regular expressions and environment variables.
  6. Provide a file pattern for the files to collect. File patterns support regular expressions. For more information, see Regular expressions and environment variables.
  7. Specify the maximum depth of directories to recurse from the path you provided.
  8. Specify the maximum number of files to collect.
  9. Select Raw to preserve the format of the files that are collected.
  10. Select the operating systems from which you want the file collector to collect files.
  11. Click the check mark in the top right to save the file collector.
  12. Click Save.

Script sets

You can configure scripts to run on endpoints when you deploy the collection. Supported scripting languages include PowerShell and Python.

  1. From the Threat Response menu, click Management > Live Response. Click Create > Script Set.
  2. In the General Information section, provide a name and description for the script set.
  3. Under Scripts click Add a Script.
  4. Provide a filename for the script.
  5. Select Python or PowerShell as the type of script.
  6. Provide any script arguments to use as part of running the script.
  7. Add the script source.
  8. Click Save.

Script output is saved in a file that has the same name as the script, with -results appended to the file extension. For example, a script named test.ps1 creates output in test.ps1-results. All standard output is directed to the collector directory.

Collect data from endpoints

To collect data from endpoints, deploy a Live Response package.

To prevent resource overload on endpoints, only issue this action manually. Do not create a scheduled action.

  1. From the Threat Response menu, click Response Activity > Create > Live Response.
  2. Target endpoints for data collection. Use an operating system-based question, for example: Get Computer Name from machines with Is Windows containing "True".
  3. Select the endpoints from which you want to collect data and click Deploy Action.
  4. In the Deployment Package field, type Live Response.
  5. Select the package that matches the collection and destination settings that you want to deploy.
  6. In the Base Directory field, provide a directory name where files are placed as they are collected. This directory is created under the Remote Path value in the destination that the Live Response package uses. The location of the Remote Path depends on the destination type: in SMB destinations it is explicit, whereas in SSH destinations it is relative to the home directory of the present user. For example, if you provide a Base Directory of MyCollection for an SSH destination where the Remote Path is FileCollection, the result is /home/username/FileCollection/MyCollection.

  7. Optionally select Flatten Output Files if you want all collected files placed in one directory where the filename includes the original path, but does not retain the folder structure.

  8. Click Show Preview to Continue.
  9. After you preview the list of endpoints to which the action is being deployed, click Deploy Action.
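The Base Directory composition in step 6 can be sketched for an SSH destination. The values below are the example values from that step; the home directory /home/username is an assumption.

```shell
# Example values from step 6 (for SSH destinations, Remote Path is relative to $HOME)
HOME_DIR="/home/username"     # home directory of the present user (assumed)
REMOTE_PATH="FileCollection"  # Remote Path configured in the destination
BASE_DIR="MyCollection"       # Base Directory entered when deploying the action

echo "$HOME_DIR/$REMOTE_PATH/$BASE_DIR"   # prints /home/username/FileCollection/MyCollection
```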

Threat Response tests the connection by writing an LRConnectionTest file to the destination. If the write fails, the action tries the other destinations in the transfer configuration in the order they are listed in the configuration file. If all the connection tests fail, the action does not proceed.

Tanium shows the package as complete almost immediately after the package is downloaded on the endpoints. This completion is not accurate because Live Response runs in detached mode. File transfers continue after the action completes.

The actual time to complete the transfer depends on the endpoint activity and connection speed between the endpoint and the destination system.

For macOS and Linux endpoints, memory data that is transferred to a destination is packaged in a ZIP file. For example, if you selected memory details as an included module, Live Response creates a ZIP file that contains a raw memory dump and additional system files. You can analyze this data with a tool such as Winpmem or Volexity Surge. For Windows endpoints, memory data is collected as one or more raw files.

Collect logs

In addition to the standard action log on the endpoint (Tanium_Client_Location\Downloads\Action_###\Action_####.log), a Live Response log file of activities resides in the same directory. This file follows the naming convention YYYYMMDDhhmm_LR.log.

When collection completes, the YYYYMMDDhhmm_LR.log is copied to the destination. The action log is not copied to the destination.

Use both the action log and the Live Response log to troubleshoot problems. The action log captures messages written to standard error (stderr).

Regular expressions and environment variables

File patterns support regular expression syntax.

The following examples show how Live Response uses both regular expressions and environment variables on Windows, Linux, and macOS endpoints.

  • Collect the hosts file:
    • Windows: path %systemdrive%\windows\system32\drivers\etc, file pattern ^hosts$. Collects a single file named hosts from the appropriate directory, using the %systemdrive% environment variable to avoid assuming that the system drive is C:.
    • Linux/macOS: path /etc, file pattern ^hosts$. Matches the hosts file.
  • Collect the Bash history of every user:
    • Windows: not applicable.
    • Linux/macOS: path $HOME, file pattern /.bash_history$. Matches a file named .bash_history. The forward slash in /.bash_history$ is necessary because the $HOME environment variable does not include a trailing forward slash.
  • Collect a file named findme.txt from the platform root:
    • Windows: path C:\, file pattern ^findme.txt$. Matches a file named findme.txt.
    • Linux/macOS: path /, file pattern ^findme.txt$. Matches a file named findme.txt. Because the unescaped . matches any character, a file named findme-txt would also match; use ^findme\.txt$ to match only the literal name.
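Because the pattern ^findme.txt$ is anchored at both ends, it matches the whole file name, and the unescaped dot matches any single character. A quick grep check (illustrative only):

```shell
# ^findme.txt$ is anchored at both ends; the unescaped . matches any character
printf '%s\n' findme.txt findmeatxt notfindme.txt | grep -E '^findme.txt$'
# prints:
#   findme.txt
#   findmeatxt
```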

Any environment variables that you use resolve as follows. Paths containing environment variables apply to all user profiles on the endpoint.

  • %appdata% (Windows): C:\Users\username\appdata\roaming
  • %homepath% (Windows): \Users\username
  • %localappdata% (Windows): C:\Users\username\appdata\local
  • %psmodulepath% (Windows): C:\Users\username\documents\windowspowershell\modules
  • %temp% (Windows): C:\Users\username\appdata\local\temp
  • %tmp% (Windows): C:\Users\username\appdata\local\temp
  • %userprofile% (Windows): C:\Users\username
  • %taniumdir% (Windows): The Tanium Client directory. Defaults are \Program Files\Tanium\Tanium Client\ (32-bit OS) and \Program Files (x86)\Tanium\Tanium Client\ (64-bit OS).
  • $TANIUMDIR (Linux, Mac): The Tanium Client directory. Defaults are /Library/Tanium/TaniumClient/ (macOS) and /opt/Tanium/TaniumClient/ (Linux).
  • $HOME (Linux, Mac): The home directory of every user whose login shell is not blocklisted and matches a shell listed in the /etc/shells file. If there is no /etc/shells file, all shells are allowed.
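The $HOME resolution rule can be approximated with standard tools: take each user from /etc/passwd whose login shell appears in /etc/shells, and emit that user's home directory. The sketch below runs against fabricated copies of both files rather than the real ones.

```shell
# Fabricated /etc/passwd and /etc/shells for illustration
cat > passwd.sample <<'EOF'
root:x:0:0:root:/root:/bin/bash
daemon:x:1:1:daemon:/usr/sbin:/usr/sbin/nologin
alice:x:1000:1000::/home/alice:/bin/bash
EOF
cat > shells.sample <<'EOF'
/bin/sh
/bin/bash
EOF

# Emit home directories only for users whose shell is listed in shells.sample
awk -F: 'NR==FNR { ok[$1]; next } $7 in ok { print $6 }' shells.sample passwd.sample
# prints:
#   /root
#   /home/alice
```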

Environment variables that are local to the endpoint are supported. For example, if %SYSTEMROOT% is set on an endpoint to expand to C:\WINDOWS, you can use that variable in a path.