Collecting data from endpoints with Live Response
A critical step in the incident response process is collecting data from compromised endpoints for further forensic analysis. Threat Response provides a feature called Live Response that collects specific information from endpoints for forensic analysis, data correlation, and investigation of potentially compromised systems, using a customizable and extensible framework.
Live Response collects forensic information from endpoints and transfers the results to a network location that you specify in a package. The Live Response package contains configuration files that identify the data to collect from endpoints and the network destination where the collected files are saved.
Destinations
A destination is a location where forensic data is saved. The destination that receives information from Live Response can be an Azure storage account, an Amazon S3 bucket, or a server that communicates over the SFTP, SCP, or SMB protocol. SMB is supported only on Windows; SMB destinations are not included in Live Response packages for macOS and Linux.
For SSH (SFTP/SCP) destinations, a user with write access to the share on the destination is required. Consider modifying the /etc/ssh/sshd_config file on the server so that this account is limited to SFTP or SCP access. As a best practice, use Linux servers for SFTP/SCP destinations.
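For example, a minimal sshd_config sketch that limits a dedicated transfer account to SFTP only, assuming a hypothetical account named lruser (adjust for your environment; SCP transfers require a different configuration):
Match User lruser
    ForceCommand internal-sftp
    AllowTcpForwarding no
    X11Forwarding no
Restart the SSH service on the server after changing sshd_config.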
The key exchange algorithms supported by Live Response for SSH destinations include:
- [email protected]
- ecdh-sha2-nistp256
- ecdh-sha2-nistp384
- ecdh-sha2-nistp521
- diffie-hellman-group14-sha1
- diffie-hellman-group1-sha1
- diffie-hellman-group-exchange-sha256
At least one of these algorithms must be supported by the server for SSH (SFTP/SCP) destinations.
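To check which key exchange algorithms a destination server offers, you can print the effective sshd configuration on that server; a minimal sketch, assuming OpenSSH and root access on the destination:
sudo sshd -T | grep -i kexalgorithms
Compare the output against the list above.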
For an SMB copy location, the system account is used, so SMB shares work only with domain-joined endpoints. Either the specific endpoint computer account or the Domain Computers group must have write access to the share. The following advanced permissions are required (a sample command for granting them follows the list):
- Create files / write data
- Create folders / append data
- Write attributes
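A hypothetical example of granting these permissions to the Domain Computers group with icacls, assuming a share folder of D:\LiveResponse and a domain named CONTOSO (adjust the path, domain, and principal for your environment):
icacls "D:\LiveResponse" /grant "CONTOSO\Domain Computers":(OI)(CI)(WD,AD,WA)
WD, AD, and WA correspond to Create files / write data, Create folders / append data, and Write attributes; (OI)(CI) applies the grant to files and subfolders.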
For Amazon S3 Bucket copy locations, ensure that clients are synchronized with a time server. Transfers fail if the client time differs from the server time by more than 15 minutes.
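To confirm that an endpoint's clock is synchronized before deploying, you can check the time service status; a quick sketch, assuming w32tm on Windows and systemd's timedatectl on Linux:
w32tm /query /status
timedatectl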
Do not use SMB transfer destinations when a system has been quarantined by Tanium. Live Response uses domain authentication for transfers. When a system is quarantined it cannot reauthorize with the domain and authentication fails.
- From the Threat Response menu, click Management > Live Response. Click Create > Destination.
- In the General Information section, provide a name and description for the destination.
- Select a destination type. Available destination types are S3, SSH, and SMB. The destination type that you select determines the types of required setting information. Refer to destination types for more information.
- Click Save.
Be aware that keys you use to authenticate destinations should be considered public. Do not use keys that provide access to sensitive information, and ensure the keys are limited in scope. Do not allow keys to read, overwrite, or delete data.
Destination types
Different types of destinations require different settings.
There is no option for disabling hostkey verification for SSH destinations in Live Response for Threat Response.
Azure Destinations
For Azure destinations, the following settings are required:
Setting | Description |
---|---|
Storage Account | Provide the name of the storage account that contains Azure Storage data objects. The storage account provides a unique namespace for Azure Storage data that is accessible over HTTP or HTTPS. |
Container | Provide the name of the Azure Container Instance where Live Response data is saved. |
Key File | Provide the storage account access key to construct a connection string to access Azure Storage. The values in the connection string are used to construct the Authorization header that is passed to Azure Storage. |
SAS Token | A shared access signature (SAS) is a token that permits delegated access to resources in your storage account. The SAS token encapsulates all of the information needed to authorize a request to Azure Storage in the request URL. |
Maximum Attempts | The maximum number of times an operation is attempted before an error is produced. |
Simultaneous Uploads | Specifies the number of concurrent uploads that can occur. |
Connection Timeout | The amount of time to attempt to establish a connection. |
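Optionally, before saving the destination, you can confirm that the credentials can write to the container by uploading a small test blob with the Azure CLI; a minimal sketch with placeholder values, assuming the az CLI is installed and a local test file named LRWriteTest exists:
az storage blob upload --account-name <storage_account> --container-name <container> --name LRWriteTest --file ./LRWriteTest --sas-token "<sas_token>"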
S3 Destinations
For S3 destinations, the following settings are required:
Setting | Description |
---|---|
Bucket | The name of the S3 Bucket. When using an S3 bucket as a destination make sure that clients are synchronized with a time server. Transfers fail if the client time differs from the server time by more than 15 minutes. |
Access Key ID | An ID that corresponds with a secret access key. For example, AKIAIOSFODNN7EXAMPLE |
Secret Access Key | A secret key that corresponds with an access key ID. For example, wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY. Manage your access keys as securely as you do your user name and password. When you create access keys, you create the access key ID and secret access key as a set. During access key creation, AWS gives you one opportunity to view and download the secret access key part of the access key. If you do not download the key or if you lose the key, you can delete the access key and then create a new one. |
Region | An explicitly defined S3 region. |
Host | The fully qualified domain name of the host. |
Port | The port to use for the connection for the destination. The default is 443. |
Use SSL | SSL encryption is enabled. |
Force Path Style Types | Forces API calls to use path-style URLs where the bucket name is part of the URL path for accessing buckets. |
Connection Timeout | The amount of time to attempt to establish a connection. |
Remote Path | A path on the destination where data is collected. |
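To optionally verify that the access key can write to the bucket before saving the destination, you can use the AWS CLI; a minimal sketch with placeholder values, assuming the AWS CLI is installed and configured with the same access key and a local test file named LRWriteTest exists:
aws s3api put-object --bucket <bucket> --key <remote_path>/LRWriteTest --body ./LRWriteTest --region <region>
If the key is limited in scope to write-only access as recommended above, a subsequent aws s3 ls s3://<bucket>/ should be denied.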
SSH Destinations
For SSH destinations, the following settings are required:
Setting | Description |
---|---|
Protocol | Select SFTP or SCP as the protocol to transfer collected files to the destination. |
Authentication Type | Private Key or Password. The authentication type that you select determines whether you are prompted to provide a private key or a password to authenticate with the destination. |
Host | The fully qualified domain name of the host. |
Port | The port to use for the SSH connection for the destination. The default is 22. |
Username | The user name for the connection to the destination. |
Password or Private Key | The password for the user name, or a private key to authenticate the connection to the destination. An RSA key must be base64 encoded before you enter it into the Private Key field. You cannot password protect private keys used for Live Response destinations. In PowerShell, you can convert a key to base64 encoding; adding | clip to the end of the command sends the base64 output directly to the clipboard for pasting. If you copy and paste from the command line output instead, it is possible to introduce carriage returns, which break the input and produce an "invalid character" error in the Tanium Console. On macOS and Linux, you can convert to base64 encoding using the following command: cat <filepath> | base64 -w 0. Although the standard OpenSSH key format already appears to be base64, encode the entire key file again before uploading. |
Known Hosts | The content of an SSH known hosts file. See Generate known_hosts and test connections |
Connection Timeout | The amount of time to attempt to establish a connection. |
Remote Path | A path on the destination where data is collected. This path is relative to the home directory of the present user. Absolute paths are not supported. |
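Because private keys for Live Response destinations cannot be password protected, a common approach is to generate a dedicated, passphrase-less RSA key pair for this purpose; a sketch using OpenSSH, with a hypothetical file name:
ssh-keygen -t rsa -b 4096 -m PEM -N "" -f live_response_key
Add the contents of live_response_key.pub to the authorized_keys file of the transfer user on the destination server, then base64-encode the private key file (for example, with the cat <filepath> | base64 -w 0 command shown above) and paste the result into the Private Key field.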
SMB Destinations
The SMB transfer protocol is only supported on Windows operating systems. SMB destinations are not included in Live Response packages for macOS and Linux.
For SMB destinations, the following settings are required:
Setting | Description |
---|---|
Universal Naming Convention | The UNC path of the destination. For example, \\server\folder |
Generate known_hosts and test connections
The Threat Response - Live Response [Windows] package contains taniumfiletransfer_32.exe and taniumfiletransfer_64.exe. When the package is deployed, the file that is appropriate for the endpoint is copied to the endpoint and renamed to taniumfiletransfer.exe. Download taniumfiletransfer_32.exe or taniumfiletransfer_64.exe from the Threat Response - Live Response [Windows] package.
To see a list of supported protocols, run one of the following commands depending on the architecture you are using:
taniumfiletransfer_64 protocol
taniumfiletransfer_32 protocol
To see details about a specific protocol, including options for the protocol connection string, run: taniumfiletransfer_64 protocol <protocol>
To generate a known_hosts file for use with SFTP or SCP:
- Connect to the host you want to get a capture from.
- Get the target IP you want to transfer the file to.
- Run the following command: C:\> taniumfiletransfer_64.exe ssh-keyscan <host> > known_hosts on Windows endpoints or ssh-keyscan <host> > known_hosts on Linux and macOS endpoints.
To update the known_hosts file in the Live Response package, paste the contents of the file that you generated into the Known Hosts field in the destination. The ssh-keyscan command produces a file in which every SSH fingerprint is commented out with a # character. Edit the file and remove the # before each fingerprint that should be accepted. Descriptions of the fingerprints are prefixed with two ## characters; leave these as comments. If you use a non-standard port, edit the output in the known_hosts file to reference that port. The lines that start with a double hash (##) need to be edited. For example, if you are using port 222, update the line:
## Host: 192.168.0.113:22 (192.168.0.113:22)...
to:
## Host: 192.168.0.113:222 (192.168.0.113:222)...
If the known_hosts file is not edited to reference the correct port, Live Response fails. The known_hosts file must be ASCII encoded.
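If the destination listens on a non-standard port and you generate the file with the OpenSSH ssh-keyscan utility on a Linux or macOS system, you can scan that port directly; a sketch, assuming port 222:
ssh-keyscan -p 222 <host> > known_hosts
Review the output against the format described above before pasting it into the Known Hosts field.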
Alternatively you can perform the following steps:
- Generate the destination package. See Collect data from endpoints.
- Delete the known_hosts file in the Threat Response - Live Response [OS] package.
- Add the file you generated with no changes. Note that a Linux package cannot be updated from a Windows endpoint due to the changes Windows makes to the file format.
- Save the package.
To test sending a file via an SFTP connection string, enter the following command:
C:\> taniumfiletransfer_64.exe send test_file -d sftp://<user>:<password>@<host>:/<dir>#knownHostsFile=known_hosts
Collections
A collection defines the data to collect from an endpoint. The following configurations are provided with Live Response:
- Standard Collection: Use for default data. The standard collection contains file collectors to collect specific files from endpoints. See File Collector Sets for a reference of the file collectors that are contained in each type of collection. The following data is captured by default, and is configurable in the standard collection:
  - Enumerate Running Processes
  - Enumerate Process Modules
  - Enumerate System Drivers
  - Analyze Windows Prefetch
  - Analyze Windows Amcache
  - Analyze Windows Shimcache
  - Analyze Windows Filesystem .LNK Files
  - Analyze Windows Scheduled Jobs
  - Analyze Windows Recent Files Cache
  - Analyze Windows User Assist Recent Applications
  - Analyze Network Connection Details
  - Analyze Process and File Handle Details
  - Analyze System Startup
- Extended Collection: Use to collect the same data as the standard collection, plus additional file-based artifacts, such as the kernel, the Master File Table, the USN journal, event logs, registry hive files, and so on. The extended collection contains file collectors to collect specific files from endpoints. See File Collector Sets for a reference of the file collectors that are contained in each type of collection. The following data is configurable in the extended collection:
  - Enumerate Running Processes
  - Enumerate Process Modules
  - Enumerate System Drivers
  - Analyze Windows Prefetch
  - Analyze Windows Amcache
  - Analyze Windows Shimcache
  - Analyze Windows Filesystem .LNK Files
  - Analyze Windows Scheduled Jobs
  - Analyze Windows Scheduled Tasks Via Schtasks
  - Analyze Windows Recent Files Cache
  - Analyze Windows User Assist Recent Applications
  - Collect Parsed UsnJrnl Entries
  - Analyze Network Connection Details
  - Analyze Process and File Handle Details
  - Analyze System Startup Applications
  - Collect Recorder Database Snapshot
The option to Collect Recorder Database Snapshot enables you to collect a snapshot of either recorder.db or monitor.db from endpoints. Collect Recorder Database Snapshot creates a snapshot of a recorder database - whether or not it is encrypted - and adds the snapshot to the collection. The snapshot that this module creates is removed from the endpoint when the collection has completed. By default, recorder database snapshots are saved in a folder named RecorderSnapshot on a path that corresponds with the name of the endpoint. For example, <base_directory>\<endpoint_name>\collector\RecorderSnapshot\<database_name>.db.
- Memory Collection: Use for memory acquisition. The memory collection contains file collectors to collect specific files from endpoints. See File Collector sets for a reference of the file collectors that are contained in each type of collection. Memory data is configurable in the memory collection.
You can create a custom configuration to collect specific data from endpoints.
- From the Threat Response menu, click Management > Live Response. Click Create > Collection.
- In the General Information section, provide a name and description for the collection.
- Select the modules that you want to include in the data collection. A module is a functional area of forensic investigation. For example, the Network Connections module collects data that is helpful to understanding network connections that the endpoint has been involved in. The operating system icons next to each module show the operating systems to which the modules apply.
- Select the File Collector sets that you want to include in the collection. See File Collectors for more information.
- Under Script Sets, select the script sets that you want to include in the collection. See Script Sets for more information.
- Click Save.
File Collector sets
File collector sets define the types of files that you want to collect from endpoints. For example, you can select all files of a specific type, or files that reside on a specific path. Live Response on Windows collects alternate data streams. The name of the alternate data stream is appended to the name of the regular data stream, separated by an underscore. For example, if an alternate data stream named hidden_datastream exists for a file named hosts, the alternate data stream is collected as <path>\hosts_hidden_datastream.
When setting a maximum recursive depth, enter -1 to represent unlimited.
File Collector set | File Collector | Featured in Collection | Operating System | Path | File Pattern | Maximum recursive depth | Maximum files to collect |
---|---|---|---|---|---|---|---|
Hosts File | Windows Hosts File | Standard, Extended | Windows | %systemdrive%\ windows\system32\ drivers\etc | (^hosts$) | 1 | 1 |
Non-Windows Hosts File | Standard, Extended | Linux, Mac | /etc | (^hosts$) | 1 | 1 | |
Etc Folder Tree | Etc | Standard, Extended | Linux, Mac | /etc | .* | 15 | Unlimited |
Shell History Files | PowerShell History | Standard, Extended | Windows | %userprofile%\ AppData\Roaming\ Microsoft\Windows\ PowerShell\PSReadline\ | ^ConsoleHost_history.txt$ | 0 | 1 |
Bourne Again (bash) Shell History | Standard, Extended | Linux, Mac | $HOME | ^\.bash_history$ | 0 | 1 | |
Bourne (sh) Shell History | Standard, Extended | Linux, Mac | $HOME | ^\.sh_history$ | 0 | 1 | |
Bourne Again (bash) Sessions | Standard, Extended | Mac | $HOME/.bash_sessions | .*history.* | 15 | Unlimited | |
Secure Shell (SSH) Files | User's Known Hosts | Standard, Extended | Linux, Mac | $HOME/.ssh | ^known_hosts$ | 0 | 1 |
User's Authorized Keys | Standard, Extended | Linux, Mac | $HOME/.ssh | ^authorized_keys$ | 0 | 1 | |
Current SSH Users | Standard, Extended | Linux, Mac | /var/run | ^utmp.* | 0 | 1 | |
SSH Logon Logoff | Standard, Extended | Linux | /var/log | ^wtmp.* | 0 | Unlimited | |
Failed SSH Logon | Standard, Extended | Linux | /var/log | ^btmp.* | 0 | Unlimited | |
SSH Last Logged On Users | Standard, Extended | Linux | /var/log | ^lastlog$ | 0 | Unlimited | |
SSH Daemon Configuration | Standard, Extended | Linux, Mac | /etc/ssh | ^sshd_config$ | 0 | Unlimited | |
SSH Client Configuration | Standard, Extended | Linux, Mac | /etc/ssh | ^ssh_config$ | 0 | Unlimited | |
Systemd Folder Tree | Systemd | Standard, Extended | Linux | /etc/systemd/system | .* | 15 | Unlimited |
Kext Details | Kext Details | Standard, Extended | Mac | /var/db/ SystemConfiguration | ^KextPolicy$ | 15 | Unlimited |
Kext Details (v11+) | Standard, Extended | Mac | /var/db/ SystemPolicyConfiguration | ^KextPolicy$ | 15 | Unlimited | |
Master File Table | Windows Master File Table | Extended | Windows | %systemdrive% | (\$MFT$) | 1 | 1 |
UsnJrnl | UsnJrnl | Extended | Windows | %systemdrive%\$Extend | (%.UsnJrnl$) | 1 | 1 |
Kernel | Windows Kernel | Extended, Memory | Windows | %systemdrive%\windows\ system32\ | ntoskrnl\.exe | 1 | 1 |
System Registry Hives | Windows System Registry Hives | Extended | Windows | %systemdrive%\windows\ system32\config\ | ((^system$)|(^security$)| (^software$)|(^sam$)) | 1 | 4 |
User Registry Hives | Windows User Registry Hives | Extended | Windows | %userprofile%\ | (^ntuser\.dat$) | 2 | Unlimited |
Windows Event Logs | Windows Event Logs | Extended | Windows | %systemdrive%\windows\ system32\winevt\logs\ | .*\.evtx | 1 | 2000 |
Windows Prefetch Files | Windows Prefetch Files | Extended | Windows | %systemroot%\prefetch\ | (.*\.pf)|(layout\.ini)|(.*\.db)|(pfsvperfstats\.bin) | 1 | Unlimited |
Chrome User Data | Windows Chrome User Data - Cache | Extended | Windows | %LOCALAPPDATA%\Google\ Chrome\User Data\ Default\Cache\ | .* | 0 | Unlimited |
Windows Chrome User Data - Local Storage | Extended | Windows | %LOCALAPPDATA%\Google\ Chrome\User Data\ Default\Local Storage\ | .* | 0 | Unlimited | |
Windows Chrome User Data - Profile | Extended | Windows | %LOCALAPPDATA%\Google\ Chrome\User Data\ Default\ | .* | 0 | Unlimited | |
MacOS Chrome Data | Extended | Mac | $HOME/Library/Application Support/Google/Chrome | .* | 9 | Unlimited | |
Linux Chrome Data | Extended | Linux | $HOME/.config/google-chrome | .* | 9 | Unlimited | |
Tanium Trace Database | Windows Tanium Trace Database | Extended | Windows | %TANIUMDIR%\ | ^monitor\.db(\-) *(wal|shm|journal)*$ | 0 | Unlimited |
Tanium Index Database | Windows Tanium Index Database | Extended | Windows | %TANIUMDIR%\Tools\EPI\ | ^EndpointIndex\.db(\-)*(wal|shm| journal)*$ | 0 | Unlimited |
Shell Configuration Files | Bourne Again (bash) Settings | Extended | Linux, Mac | $HOME | ^\.bash(rc|_profile| _aliases)$ | 0 | 1 |
C Shell (csh and tcsh) Settings | Extended | Linux, Mac | $HOME | ^\.(tcshrc| cshrc)$ | 0 | 1 | |
Available Shells | Available Shells | Extended | Linux, Mac | /etc | ^shells$ | 0 | 1 |
Passwd and Group Files | Passwd and Group Files | Extended | Linux, Mac | /etc | ^(passwd|group)$ | 0 | Unlimited |
Shadow Files | Shadow Files | Extended | Linux, Mac | /etc | ^(shadow|gshadow| master\.shadow)$ | 0 | Unlimited |
Sudoers Configuration | Sudoers File | Extended | Linux, Mac | /etc | ^sudoers$ | 0 | Unlimited |
Sudoers.d Folder Contents | Extended | Linux, Mac | /etc/sudoers.d | .* | 15 | Unlimited | |
Mount Points | Mount Points | Extended | Linux | /etc | ^fstab$ | 0 | 1 |
NFS Mount Points | Extended | Linux | /etc | ^exports.* | 0 | 1 | |
Preload Shared Libraries | LD Preload Shared Libraries | Extended | Linux | /etc | ld\.so.* | 15 | Unlimited |
LD Preload Shared Libraries Configuration Directory | Extended | Linux | /etc/ld.so.conf.d | .* | 0 | Unlimited | |
Auditd Configuration and Rules | LD Preload Shared Libraries | Extended | Linux | /etc/audit | .* | 15 | Unlimited |
RPM GPG Keys | RPM GPG Keys | Extended | Linux | /etc/pki/rpm-gpg | .* | 15 | Unlimited |
SSL/TLS Certificates and PKI | SSL/TLS Certificates Directory | Extended | Linux | /etc/pki/tls | .* | 15 | Unlimited |
SSL/TLS Certificate Authority Directory | Extended | Linux | /etc/pki/CA | .* | 15 | Unlimited | |
User Recently Used/Deleted Files | Recently Used GTK Files | Extended | Linux | $HOME/.local/share | recently-used\ .xbel | 15 | Unlimited |
Recently Deleted Info | Extended | Linux | $HOME/.local/share/ Trash/info | .* | 15 | Unlimited | |
Recently Deleted Files | Extended | Linux | $HOME/.local/share/ Trash/files | .* | 15 | Unlimited | |
User Vim Configuration | Vim Info | Extended | Linux, Mac | $HOME | ^\.viminfo$ | 0 | Unlimited |
Non-Windows Vim Configuration | Extended | Linux, Mac | $HOME | ^\.vimrc$ | 0 | Unlimited | |
Windows Vim Configuration | Extended | Windows | %homepath%\ | ^_vimrc$ | 0 | Unlimited | |
User Less History | Less History | Extended | Linux, Mac | $HOME | ^\.lesshst$ | 0 | Unlimited |
User Database History | Database History | Extended | Linux, Mac | $HOME | ^\.(psql|mysql| sqlite)_history$ | 0 | Unlimited |
Cron Settings | Cron Files | Extended | Linux, Mac | /etc/ | cron.* | 15 | Unlimited |
Cron Logs | Extended | Linux, Mac | /var/log | cron.* | 15 | Unlimited | |
Spotlight Information | Spotlight Disk Database | Extended | Mac | /.Spotlight-V100/ Store-V2 | .*store\\.db | 15 | Unlimited |
User Spotlight Shortcuts | Extended | Mac | $HOME/Library/ Preferences/ | ^com\\.apple\ \.spotlight\\.plist$ | 15 | Unlimited | |
User Spotlight Shortcuts | Extended | Mac | $HOME/Library/ Application Support/ | ^com\\.apple\ \.spotlight\\.Shortcuts$ | 15 | Unlimited | |
User and Application Analytics | MacOS User and Analytics Information | Extended | Mac | /var/db/CoreDuet/Knowledge | ^knowledgeC\\.db$ | 15 | Unlimited |
MacOS Per-User and Analytics Information | Extended | Mac | $HOME/Library/Application Support/Knowledge/ | ^knowledgeC\\.db$ | 15 | Unlimited | |
Program Execution Reports | Program Execution Reports | Extended | Mac | /Library/Logs/ DiagnosticReports | .*\\.core_analytics | 15 | Unlimited |
Program Execution Aggregate Analytics | Extended | Mac | /private/var/db/ analyticsd/aggregates | .* | 15 | Unlimited | |
Domain User Information | Domain User Information | Extended | Mac | /private/var/db/ ConfigurationProfiles/Store/ | ConfigProfiles\ \.binary | 15 | Unlimited |
User Mail Data | User Mail Data | Extended | Mac | $HOME/Library/Mail/ | .* | 15 | Unlimited |
Microsoft Office | MacOS Microsoft Office Artifacts | Extended | Mac | $HOME/Library/Group Containers/ | ^MicrosoftRegistrationDB\ \.reg$ | 15 | Unlimited |
Deleted Users | MacOS Deleted Users | Extended | Mac | /Library/Preferences/ | ^com\\.apple\\.preferences\ \.accounts\\.plist$ | 15 | Unlimited |
Notifications | MacOS Notifications Database Files | Extended | Mac | /var/folders/ | ^db(-wal|-shm)*$ | 15 | Unlimited |
Event Monitor | MacOS Event Monitor | Extended | Mac | /System/Library/LaunchDaemons/ | ^com\\.apple\\.emond\\.plist$ | 15 | Unlimited |
DHCP Leases | MacOS DHCP Leases | Extended | Mac | /private/var/db/dhcpclient/leases/ | .* | 15 | Unlimited |
MacOS Finder Information | MacOS Finder MRU | Extended | Mac | $HOME/Library/Preferences/ | ^com\\.apple\\.finder\ \.plist$ | 15 | Unlimited |
MacOS Recently Opened Items | Extended | Mac | $HOME/Library/Preferences | ^com\\.apple\\.recentitems\ \.plist$ | 15 | Unlimited | |
MacOS Recently Opened Items 11+ | Extended | Mac | $HOME/Library/Application Support/com.apple.sharedfilelist | .* | 15 | Unlimited | |
Network Services, Settings, and Logs | MacOS Network Services and Settings | Extended | Mac | /Library/Preferences/ SystemConfiguration/ | ^preferences\\.plist$ | 15 | Unlimited |
VPN Log | Extended | Mac | /var/log | ^ppp\\.out$ | 15 | Unlimited | |
Daily Network Log | Extended | Mac | /var/log | ^daily\\.out$ | 15 | Unlimited | |
Network Usage | Extended | Mac | /var/networkd | ^netusage\\.sqlite$ | 15 | Unlimited | |
MacOS Remembered Networks | Extended | Mac | /Library/Preferences/ SystemConfigurations/ | ^com\\.apple\\.airport\ \.preferences\\.plist$ | 15 | Unlimited | |
Update and Backup Information | MacOS Last Update | Extended | Mac | /Library/Preferences/ | ^com\\.apple\\.SoftwareUpdate\ \.plist$ | 15 | Unlimited |
MacOS Last Time Machine Backup | Extended | Mac | /Library/Preferences/ | ^com\\.apple\\.TimeMachine\ \.plist$ | 15 | Unlimited | |
Scripting Additions | MacOS Scripting Additions | Extended | Mac | /System/Library/ ScriptingAdditions/ | .* | 15 | Unlimited |
System Administrators | MacOS System Administrators | Extended | Mac | /var/db/dslocal/nodes/ Default/groups/ | ^admin\\.plist$ | 15 | Unlimited |
Authorization Database | MacOS Authorization Database | Extended | Mac | /var/db | ^auth\\.db$ | 1 | Unlimited |
Safari User Data | MacOS Safari User Data | Extended | Mac | $HOME/Library/Safari/ | ^(History\\.db|Downloads\ \.plist|TopSites\\.plist| RecentlyClosedTabs\ \.plist|Bookmarks\ \.plist| CloudTabs\\.db)$ | 15 | Unlimited |
Firefox User Data | MacOS Firefox User Data | Extended | Mac | $HOME/Library/Application Support/Firefox/Profiles/ | ^(places\\.sqlite|addons\ \.json|extensions\\.sqlite| extensions\\.json |search\\.sqlite)$ | 15 | Unlimited |
System Resource Utilization Management (SRUM) Database | System Resource Utilization Management (SRUM) Database | Extended | Windows | %systemdrive%\windows \system32\sru | ^srudb.dat$ | 0 | 1 |
Edge (Chromium) Internet Explorer Data Collection (Mac) | MacOS Edge Data | Extended | Mac | $HOME/Library/Application Support/Microsoft Edge/Default | .* | 9 | Unlimited |
MacOS Edge Data - Cache | Extended | Mac | $HOME/Library/Caches/Microsoft Edge/Default/Cache | .* | 9 | Unlimited | |
Edge (Chromium) Internet Explorer Data Collection (Windows) | Windows Edge User Data - Cache | Extended | Windows | %LOCALAPPDATA%\Microsoft \Edge\User Data\Default\Cache | .* | 0 | Unlimited |
Windows Edge User Data - Local Storage | Extended | Windows | %LOCALAPPDATA%\Microsoft \Edge\User Data\Default | .* | 0 | Unlimited | |
Windows IE v10-11 | Extended | Windows | %LOCALAPPDATA%\Microsoft \Windows\WebCache | .* | 0 | Unlimited | |
Windows IE v8-9 | Extended | Windows | %LOCALAPPDATA%\Microsoft \WindowsHistory | .* | 0 | Unlimited | |
Microsoft IE v6-7 | Extended | Windows | %LOCALAPPDATA%\LocalSettings \History\History | .* | 0 | Unlimited | |
Microsoft IE v8-9 Index.dat | Extended | Windows | %APPDATA%\Microsoft \Windows\IEDownloadHistory\ | index\.dat | 1 | 1 | |
Microsoft IE 8-9 Downloads | Extended | Windows | %APPDATA%\Microsoft \Windows\IEDownloadHistory\ | .* | 1 | Unlimited |
You can create custom file collector sets.
- From the Threat Response menu, click Management > Live Response. Click Create > File Collector Set.
- In the General Information section, provide a name and description for the file collection set.
- Click Create File Collector.
- Provide a name for the file collector.
- Provide a path for files to collect. Paths support environment variables and regular expressions. For more information, see Regular expressions and environment variables.
- Provide a file pattern for the files to collect. File patterns support regular expressions. For more information, see Regular expressions and environment variables. A quick way to check a pattern locally is shown after these steps.
- Specify the maximum depth of directories to recurse from the path you provided.
- Specify the maximum number of files to collect.
- Select Raw to preserve the format of the files that are collected.
- Select the operating systems from which you want the file collector to collect files.
- Click the check mark in the top right to save the file collector.
- Click Save.
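Before saving a custom file collector, you can sanity-check a file pattern against sample file names; a minimal sketch using grep -E in a POSIX shell, with a pattern taken from the Shell History Files collector set:
printf 'ConsoleHost_history.txt\nnotes.txt\n' | grep -E '^ConsoleHost_history.txt$'
Only the file names that the pattern matches are printed; in this example, only ConsoleHost_history.txt.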
Script sets
You can configure scripts to run on endpoints when you deploy the collection. Supported scripting languages include PowerShell and Python.
- From the Threat Response menu, click Management > Live Response. Click Create > Script Set.
- In the General Information section, provide a name and description for the script set.
- Under Scripts click Add a Script.
- Provide a filename for the script.
- Select Python or PowerShell as the type of script.
- Provide any script arguments to use as part of running the script.
- Add the script source.
- Click Save.
Script output is saved in a file that has the same name as the script, with -results appended to the file extension. For example, a script named test.ps1 creates output in test.ps1-results. All standard output is directed to the collector directory.
Collect data from endpoints
To collect data from endpoints, deploy a Live Response package.
To prevent resource overload on endpoints, only issue this action manually. Do not create a scheduled action.
- From the Threat Response menu, click Response Activity > Create > Live Response.
- Target endpoints for data collection. Use an operating system-based question, for example: Get Computer Name from machines with Is Windows containing "True".
- Select the endpoints from which you want to collect data and click Deploy Action.
- In the Deployment Package field, type Live Response.
- Select the package that matches the collection and destination settings that you want to deploy.
- In the Base Directory field, provide a directory name where files are placed as they are collected. This directory is created under the Remote Path value of the destination that the Live Response package uses. The location of the Remote Path depends on the destination type: for SMB destinations it is explicit, whereas for SSH destinations it is relative to the home directory of the connecting user. For example, if you provide a Base Directory of MyCollection for an SSH destination where the Remote Path is FileCollection, the files are placed in /home/username/FileCollection/MyCollection.
- Optionally, select Flatten Output Files if you want all collected files placed in one directory, where each file name includes the original path but the folder structure is not retained.
- Click Show Preview to Continue.
- After you preview the list of endpoints to which the action is being deployed, click Deploy Action.
Threat Response tests the connection by writing an LRConnectionTest file to the destination. If the write fails, the action tries the other destinations in the transfer configuration in the order they are listed in the configuration file. If all of the connection tests fail, the action does not proceed.
Tanium shows the package as complete almost immediately after the package is downloaded on the endpoints. This completion is not accurate because Live Response runs in detached mode. File transfers continue after the action completes.
The actual time to complete the transfer depends on the endpoint activity and connection speed between the endpoint and the destination system.
For macOS and Linux endpoints, memory data that is transferred to a destination is packaged in a ZIP file. For example, if you selected memory details as an included module, Live Response creates a ZIP file that contains a raw memory dump and additional system files. You can analyze this data with a tool such as Winpmem or Volexity Surge. For Windows endpoints, memory data is collected as one or more raw files.
Collect logs
In addition to the standard action logs on the endpoint (Tanium_Client_Location\Downloads\Action_###\Action_####.log), a log file of activities resides in the same directory. This file follows the naming convention: YYYYMMDDhhmm_LR.log.
When collection completes, the YYYYMMDDhhmm_LR.log is copied to the destination. The action log is not copied to the destination.
Use both the action log and the Live Response log to troubleshoot problems. The action log captures messages written to standard error (stderr).
Regular expressions and environment variables
File patterns support regular expression syntax.
The following table provides some example patterns to show how Live Response uses both regular expressions and environment variables on Windows, Linux, and macOS endpoints.
Example Live Response task | Operating system | Path | File pattern | Explanation |
---|---|---|---|---|
Collect host file | Windows | %systemdrive%\windows\ system32\drivers\etc | ^hosts$ | Collects a single file called 'hosts' from an appropriate directory using the %systemdrive% environment variable to avoid assuming the system drive is C:. |
Linux/macOS | /etc | ^hosts$ | In this example, hosts matches. | |
Collect Bash History of every user | Windows | Not Applicable | Not Applicable | Not Applicable |
Linux/macOS | $HOME | /.bash_history$ | A file name that matches .bash_history. The forward slash in the file pattern /.bash_history$ is necessary since an environment variable ($HOME) is used for a path and the environment variable does not include a trailing forward slash. | |
Collect a file named findme.txt from platform root | Windows | C:\ | ^findme.txt$ | Matches a file named findme.txt in the root of the system drive. |
Linux/macOS | / | ^findme.txt$ | Matches a file named findme.txt in the root of the file system. |
Any environment variables that you use resolve as described in the following table. Paths containing environment variables apply to all user profiles on the endpoint.
Environment variable | Supported operating system | Corresponding value |
---|---|---|
%appdata% | Windows | C:\Users\username\appdata\roaming |
%homepath% | Windows | \Users\username |
%localappdata% | Windows | C:\Users\username\appdata\local |
%psmodulepath% | Windows | C:\Users\username\documents\windowspowershell\modules |
%temp% | Windows | C:\Users\username\appdata\local\temp |
%tmp% | Windows | C:\Users\username\appdata\local\temp |
%userprofile% | Windows | C:\Users\username |
%taniumdir% | Windows | The Tanium Client directory. Defaults: \Program Files\Tanium\Tanium Client\ (32-bit OS); \Program Files (x86)\Tanium\Tanium Client\ (64-bit OS) |
$TANIUMDIR | Linux, Mac | The Tanium Client directory. Defaults: /Library/Tanium/TaniumClient/ (macOS); /opt/Tanium/TaniumClient/ (Linux) |
$HOME | Linux, Mac | Resolves to the home directory of each user whose login shell is not blocklisted and is listed in the /etc/shells file. If there is no /etc/shells file, all shells are allowed. |
Environment variables that are local to the endpoint are supported. For example, if %SYSTEMROOT% is set on an endpoint to expand to C:\WINDOWS, you can use that variable in a path.