Extracting Activity Data
This workflow guide demonstrates how to programmatically query and export Shield activity logs for analysis, reporting, and SIEM integration.
Overview
Shield logs every transaction it processes, creating a comprehensive audit trail of:
- Sensitive data detection events
- Obfuscation actions applied
- User and application context
- Request/response metadata
- Rule and policy matches
This guide covers the complete workflow for extracting this data using Shield's Activities API, from simple queries to CSV exports for analysis in Excel, business intelligence tools, or security platforms.
Workflow Steps
Step 1: Understanding the Query Process
Shield uses a two-step query process:
- Build Query - Convert simple filter criteria into Shield's advanced query syntax
- Execute Query - Retrieve activities matching the query
This approach allows you to build queries programmatically without manually constructing complex query strings.
API Endpoints:
- POST /api/activities/convertsearch - Convert simple filters to advanced query syntax
- GET /api/activities - Retrieve activities using the advanced query
- GET /api/activities/csv - Export activities directly to CSV format
Step 2: Set Up Authentication
All API requests require authentication using an API key as a Bearer token.
Configure Authentication
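The code samples in the following steps share a small amount of setup. A minimal Python version might look like this; the host, port, and key are placeholders you must replace with your own values:

```python
import requests  # HTTP client used throughout this guide

# Placeholders: substitute your Shield host and a valid API key
BASE_URL = "https://your-shield-host:8080"
API_KEY = "YOUR_API_KEY"

# Every request authenticates with the API key as a Bearer token
HEADERS = {"Authorization": f"Bearer {API_KEY}"}
```

The bash examples assume the equivalent `API_KEY` environment variable, and the Node.js examples assume matching `BASE_URL`, `API_KEY`, and `HEADERS` constants.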
Step 3: Build Your Query
Use the /api/activities/convertsearch endpoint to convert simple filter criteria into the advanced query syntax.
Activities from the Last 7 Days
Build Query for Last 7 Days
def build_query(filters):
    """Convert simple filters to advanced query syntax."""
    query_request = {"simpleToAdvanced": filters}
    response = requests.post(
        f"{BASE_URL}/api/activities/convertsearch",
        headers=HEADERS,
        json=query_request
    )
    response.raise_for_status()
    return response.json()["simpleToAdvanced"]

# Query for last 7 days
query = build_query({
    "timestamp": {
        "withinLast": {"days": 7, "hours": 0, "minutes": 0}
    }
})
print(f"Advanced Query: {query}")
async function buildQuery(filters) {
  // Convert simple filters to advanced query syntax
  const queryRequest = { simpleToAdvanced: filters };
  const response = await axios.post(
    `${BASE_URL}/api/activities/convertsearch`,
    queryRequest,
    { headers: HEADERS }
  );
  return response.data.simpleToAdvanced;
}

// Query for last 7 days
const query = await buildQuery({
  timestamp: {
    withinLast: { days: 7, hours: 0, minutes: 0 }
  }
});
console.log(`Advanced Query: ${query}`);
Sensitive Data Detection Events
Query for Detection Events
# Query for activities where sensitive data was detected
curl -X POST "https://your-shield-host:8080/api/activities/convertsearch" \
  -H "Authorization: Bearer $API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "simpleToAdvanced": {
      "detected": ["true"],
      "timestamp": {"withinLast": {"days": 1, "hours": 0, "minutes": 0}}
    }
  }'
Specific Data Types
Query by Data Type
# First, get available data type IDs
DATATYPES=$(curl -s "https://your-shield-host:8080/api/datatypes" \
  -H "Authorization: Bearer $API_KEY")

# Find SSN and credit card data type IDs
SSN_ID=$(echo "$DATATYPES" | jq -r '.items[] | select(.type=="US_SSN") | .id')
CC_ID=$(echo "$DATATYPES" | jq -r '.items[] | select(.type=="CREDIT_CARD") | .id')

# Query for activities with SSN or credit card detection
curl -X POST "https://your-shield-host:8080/api/activities/convertsearch" \
  -H "Authorization: Bearer $API_KEY" \
  -H "Content-Type: application/json" \
  -d "{
    \"simpleToAdvanced\": {
      \"detectedDatatypes\": [\"$SSN_ID\", \"$CC_ID\"],
      \"timestamp\": {\"withinLast\": {\"days\": 7, \"hours\": 0, \"minutes\": 0}}
    }
  }"
# First, get available data type IDs
datatypes_response = requests.get(
    f"{BASE_URL}/api/datatypes",
    headers=HEADERS
).json()

# Find SSN and credit card data type IDs
ssn_id = next((dt["id"] for dt in datatypes_response["items"] if dt["type"] == "US_SSN"), None)
cc_id = next((dt["id"] for dt in datatypes_response["items"] if dt["type"] == "CREDIT_CARD"), None)

# Query for activities with SSN or credit card detection
query = build_query({
    "detectedDatatypes": [ssn_id, cc_id],
    "timestamp": {"withinLast": {"days": 7, "hours": 0, "minutes": 0}}
})
// First, get available data type IDs
const datatypesResponse = await axios.get(
  `${BASE_URL}/api/datatypes`,
  { headers: HEADERS }
);

// Find SSN and credit card data type IDs
const ssnId = datatypesResponse.data.items.find(dt => dt.type === 'US_SSN')?.id;
const ccId = datatypesResponse.data.items.find(dt => dt.type === 'CREDIT_CARD')?.id;

// Query for activities with SSN or credit card detection
const query = await buildQuery({
  detectedDatatypes: [ssnId, ccId],
  timestamp: { withinLast: { days: 7, hours: 0, minutes: 0 } }
});
User-Specific Activities
Query by Username
# Query for specific user's activities
curl -X POST "https://your-shield-host:8080/api/activities/convertsearch" \
  -H "Authorization: Bearer $API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "simpleToAdvanced": {
      "usernames": ["john.doe@company.com"],
      "timestamp": {
        "inTheRange": {
          "start": "2024-01-01T00:00:00Z",
          "end": "2024-01-31T23:59:59Z"
        }
      }
    }
  }'
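The same user-scoped filter can be built in Python. This sketch only constructs the filter payload; the username and date range are placeholders, and in practice you would pass the dict to the build_query helper from earlier in this step:

```python
# Filter payload for one user's activities over an explicit date range.
# In real use: query = build_query(user_filter)
user_filter = {
    "usernames": ["john.doe@company.com"],  # placeholder username
    "timestamp": {
        "inTheRange": {
            "start": "2024-01-01T00:00:00Z",
            "end": "2024-01-31T23:59:59Z"
        }
    }
}
print(user_filter["usernames"])
```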
Step 4: Retrieve Activities
Use the /api/activities endpoint with the advanced query to retrieve matching activities.
Retrieve Activities
# Retrieve activities matching the query
QUERY="your-advanced-query-here"
RESPONSE=$(curl -s -G "https://your-shield-host:8080/api/activities" \
  -H "Authorization: Bearer $API_KEY" \
  --data-urlencode "search=$QUERY" \
  --data-urlencode "skip=0" \
  --data-urlencode "take=100" \
  --data-urlencode "sortBy=timestamp desc")

# Display results
echo "$RESPONSE" | jq '{
  total: .count,
  retrieved: (.items | length),
  items: .items
}'
def get_activities(query, skip=0, take=100):
    """Retrieve activities matching the query."""
    params = {
        "search": query,
        "skip": skip,
        "take": take,
        "sortBy": "timestamp desc"
    }
    response = requests.get(
        f"{BASE_URL}/api/activities",
        headers=HEADERS,
        params=params
    )
    response.raise_for_status()
    return response.json()

# Get activities
activities = get_activities(query)
print(f"Total activities: {activities['count']}")
print(f"Retrieved: {len(activities['items'])} activities")
async function getActivities(query, skip = 0, take = 100) {
  // Retrieve activities matching the query
  const params = {
    search: query,
    skip: skip,
    take: take,
    sortBy: 'timestamp desc'
  };
  const response = await axios.get(
    `${BASE_URL}/api/activities`,
    {
      headers: HEADERS,
      params: params
    }
  );
  return response.data;
}

// Get activities
const activities = await getActivities(query);
console.log(`Total activities: ${activities.count}`);
console.log(`Retrieved: ${activities.items.length} activities`);
Step 5: Process the Results
Activity records contain comprehensive metadata. Here's how to extract key information:
Process Activity Records
# Extract key information from activity records
echo "$RESPONSE" | jq -r '.items[] |
  "--- Activity \(.id) ---",
  "Timestamp: \(.timestamp | strftime("%Y-%m-%d %H:%M:%S"))",
  "URL: \(.url)",
  "Username: \(.username // "N/A")",
  "Application: \(.app // "N/A")",
  (if .datatypesDetected then
    "Detected Data Types: \([.datatypesDetected[].type] | join(", "))"
  else empty end),
  (if .datatypesObfuscated then
    "Obfuscated Data Types: \([.datatypesObfuscated[].type] | join(", "))"
  else empty end),
  (if .rules then
    "Rules Matched: \([.rules[].name] | join(", "))"
  else empty end),
  ""'
def process_activities(activities):
    """Extract key information from activity records."""
    for activity in activities["items"]:
        print(f"\n--- Activity {activity['id']} ---")
        print(f"Timestamp: {datetime.fromtimestamp(activity['timestamp'])}")
        print(f"URL: {activity['url']}")
        print(f"Username: {activity.get('username', 'N/A')}")
        print(f"Application: {activity.get('app', 'N/A')}")
        # Data types detected
        if activity.get("datatypesDetected"):
            detected = [dt["type"] for dt in activity["datatypesDetected"]]
            print(f"Detected Data Types: {', '.join(detected)}")
        # Data types obfuscated
        if activity.get("datatypesObfuscated"):
            obfuscated = [dt["type"] for dt in activity["datatypesObfuscated"]]
            print(f"Obfuscated Data Types: {', '.join(obfuscated)}")
        # Rules matched
        if activity.get("rules"):
            rules = [rule["name"] for rule in activity["rules"]]
            print(f"Rules Matched: {', '.join(rules)}")

process_activities(activities)
function processActivities(activities) {
  // Extract key information from activity records
  for (const activity of activities.items) {
    console.log(`\n--- Activity ${activity.id} ---`);
    console.log(`Timestamp: ${new Date(activity.timestamp * 1000).toLocaleString()}`);
    console.log(`URL: ${activity.url}`);
    console.log(`Username: ${activity.username || 'N/A'}`);
    console.log(`Application: ${activity.app || 'N/A'}`);
    // Data types detected
    if (activity.datatypesDetected) {
      const detected = activity.datatypesDetected.map(dt => dt.type);
      console.log(`Detected Data Types: ${detected.join(', ')}`);
    }
    // Data types obfuscated
    if (activity.datatypesObfuscated) {
      const obfuscated = activity.datatypesObfuscated.map(dt => dt.type);
      console.log(`Obfuscated Data Types: ${obfuscated.join(', ')}`);
    }
    // Rules matched
    if (activity.rules) {
      const rules = activity.rules.map(rule => rule.name);
      console.log(`Rules Matched: ${rules.join(', ')}`);
    }
  }
}

processActivities(activities);
Step 6: Pagination for Large Result Sets
For queries returning more than 100 activities, use pagination:
Paginate Large Result Sets
#!/bin/bash
# Retrieve all activities using pagination
QUERY="your-advanced-query-here"
BATCH_SIZE=100
SKIP=0
ALL_ACTIVITIES="[]"

while true; do
  BATCH=$(curl -s -G "https://your-shield-host:8080/api/activities" \
    -H "Authorization: Bearer $API_KEY" \
    --data-urlencode "search=$QUERY" \
    --data-urlencode "skip=$SKIP" \
    --data-urlencode "take=$BATCH_SIZE" \
    --data-urlencode "sortBy=timestamp desc")

  # Merge items
  ALL_ACTIVITIES=$(echo "$ALL_ACTIVITIES" | jq --argjson batch "$BATCH" '. + $batch.items')

  # Get counts
  TOTAL_COUNT=$(echo "$BATCH" | jq -r '.count')
  CURRENT_COUNT=$(echo "$ALL_ACTIVITIES" | jq 'length')
  echo "Retrieved $CURRENT_COUNT of $TOTAL_COUNT activities"

  # Check if we've retrieved all activities
  if [ "$CURRENT_COUNT" -ge "$TOTAL_COUNT" ]; then
    break
  fi
  SKIP=$((SKIP + BATCH_SIZE))
done

echo "Total activities retrieved: $(echo "$ALL_ACTIVITIES" | jq 'length')"
def get_all_activities(query, batch_size=100):
    """Retrieve all activities using pagination."""
    all_activities = []
    skip = 0
    while True:
        batch = get_activities(query, skip=skip, take=batch_size)
        all_activities.extend(batch["items"])
        print(f"Retrieved {len(all_activities)} of {batch['count']} activities")
        # Check if we've retrieved all activities
        if len(all_activities) >= batch["count"]:
            break
        skip += batch_size
    return all_activities

# Get all activities (handles pagination automatically)
all_activities = get_all_activities(query)
print(f"Total activities retrieved: {len(all_activities)}")
async function getAllActivities(query, batchSize = 100) {
  // Retrieve all activities using pagination
  const allActivities = [];
  let skip = 0;
  while (true) {
    const batch = await getActivities(query, skip, batchSize);
    allActivities.push(...batch.items);
    console.log(`Retrieved ${allActivities.length} of ${batch.count} activities`);
    // Check if we've retrieved all activities
    if (allActivities.length >= batch.count) {
      break;
    }
    skip += batchSize;
  }
  return allActivities;
}

// Get all activities (handles pagination automatically)
const allActivities = await getAllActivities(query);
console.log(`Total activities retrieved: ${allActivities.length}`);
Step 7: Export to CSV
For analysis in Excel or BI tools, export activities directly to CSV format:
Export to CSV
import urllib.parse

def export_to_csv(query, filename="activities.csv"):
    """Export activities to CSV file."""
    # CSV endpoint supports query parameter for token
    params = {
        "token": API_KEY,
        "search": query
    }
    # Build URL with query parameters
    url = f"{BASE_URL}/api/activities/csv?" + urllib.parse.urlencode(params)
    response = requests.get(url)
    response.raise_for_status()
    # Save to file
    with open(filename, 'wb') as f:
        f.write(response.content)
    print(f"Exported to {filename}")

# Export query results to CSV
export_to_csv(query, "shield_activities.csv")
async function exportToCSV(query, filename = 'activities.csv') {
  // Export activities to CSV file
  const params = new URLSearchParams({
    token: API_KEY,
    search: query
  });
  const response = await axios.get(
    `${BASE_URL}/api/activities/csv?${params.toString()}`,
    { responseType: 'arraybuffer' }
  );
  // Save to file
  fs.writeFileSync(filename, response.data);
  console.log(`Exported to ${filename}`);
}

// Export query results to CSV
await exportToCSV(query, 'shield_activities.csv');
Note: The CSV export endpoint accepts the API key as either an Authorization: Bearer header or a token query parameter. The query-parameter method is provided for compatibility with BI tools that cannot set custom headers.
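Once exported, the CSV can be loaded programmatically for quick analysis. A minimal sketch using Python's csv module; note the header names in the sample below are illustrative placeholders, since the actual column set depends on your Shield version (check the first line of your export):

```python
import csv
import io

# Stand-in for open("shield_activities.csv") -- a tiny inline sample
# with hypothetical column names (timestamp, username, url).
sample = io.StringIO(
    "timestamp,username,url\n"
    "1704067200,john.doe@company.com,https://api.example.com\n"
)

# DictReader maps each row to a dict keyed by the header line
rows = list(csv.DictReader(sample))
print(f"Parsed {len(rows)} activity rows")
```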
Step 8: Advanced Filtering
Combine multiple filter criteria for more specific queries:
Compliance Reporting
# Complex query: Credit card detections in production environment
# by specific user group in the last hour

# First get the required IDs
DATATYPES=$(curl -s "https://your-shield-host:8080/api/datatypes" \
  -H "Authorization: Bearer $API_KEY")
CC_ID=$(echo "$DATATYPES" | jq -r '.items[] | select(.type=="CREDIT_CARD") | .id')

APPS=$(curl -s "https://your-shield-host:8080/api/apps" \
  -H "Authorization: Bearer $API_KEY")
PROD_APP_ID=$(echo "$APPS" | jq -r '.items[] | select(.name=="Production API") | .id')

# Build complex query
COMPLEX_QUERY=$(curl -s -X POST "https://your-shield-host:8080/api/activities/convertsearch" \
  -H "Authorization: Bearer $API_KEY" \
  -H "Content-Type: application/json" \
  -d "{
    \"simpleToAdvanced\": {
      \"detectedDatatypes\": [\"$CC_ID\"],
      \"userGroups\": [\"Finance\"],
      \"apps\": [\"$PROD_APP_ID\"],
      \"obfuscated\": [\"true\"],
      \"timestamp\": {\"withinLast\": {\"days\": 0, \"hours\": 1, \"minutes\": 0}}
    }
  }" | jq -r '.simpleToAdvanced')

# Get activities
curl -s -G "https://your-shield-host:8080/api/activities" \
  -H "Authorization: Bearer $API_KEY" \
  --data-urlencode "search=$COMPLEX_QUERY" | jq
# Complex query: Credit card detections in production environment
# by specific user group in the last hour

# First get the required IDs
datatypes = requests.get(f"{BASE_URL}/api/datatypes", headers=HEADERS).json()
cc_id = next(dt["id"] for dt in datatypes["items"] if dt["type"] == "CREDIT_CARD")

apps = requests.get(f"{BASE_URL}/api/apps", headers=HEADERS).json()
prod_app_id = next(app["id"] for app in apps["items"] if app["name"] == "Production API")

query = build_query({
    "detectedDatatypes": [cc_id],
    "userGroups": ["Finance"],
    "apps": [prod_app_id],
    "obfuscated": ["true"],
    "timestamp": {"withinLast": {"days": 0, "hours": 1, "minutes": 0}}
})
activities = get_activities(query)
// Complex query: Credit card detections in production environment
// by specific user group in the last hour

// First get the required IDs
const datatypes = await axios.get(
  `${BASE_URL}/api/datatypes`,
  { headers: HEADERS }
);
const ccId = datatypes.data.items.find(dt => dt.type === 'CREDIT_CARD')?.id;

const apps = await axios.get(
  `${BASE_URL}/api/apps`,
  { headers: HEADERS }
);
const prodAppId = apps.data.items.find(app => app.name === 'Production API')?.id;

const query = await buildQuery({
  detectedDatatypes: [ccId],
  userGroups: ['Finance'],
  apps: [prodAppId],
  obfuscated: ['true'],
  timestamp: { withinLast: { days: 0, hours: 1, minutes: 0 } }
});
const activities = await getActivities(query);
Complete Example Script
Here's a complete script that queries activities and exports to both JSON and CSV:
Complete Activity Extraction Script
#!/bin/bash
# shield_activity_extractor.sh
# Queries Shield activities and exports to JSON and CSV formats

# Configuration
API_KEY="YOUR_API_KEY"
BASE_URL="https://your-shield-host:8080"
TIMESTAMP=$(date +%Y%m%d_%H%M%S)
JSON_FILE="activities_${TIMESTAMP}.json"
CSV_FILE="activities_${TIMESTAMP}.csv"

# Build query for last 7 days with sensitive data detected
echo "Building query..."
QUERY=$(curl -s -X POST "$BASE_URL/api/activities/convertsearch" \
  -H "Authorization: Bearer $API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "simpleToAdvanced": {
      "detected": ["true"],
      "timestamp": {"withinLast": {"days": 7, "hours": 0, "minutes": 0}}
    }
  }' | jq -r '.simpleToAdvanced')

# Retrieve all activities with pagination
echo "Retrieving activities..."
ALL_ACTIVITIES="[]"
SKIP=0
BATCH_SIZE=100

while true; do
  BATCH=$(curl -s -G "$BASE_URL/api/activities" \
    -H "Authorization: Bearer $API_KEY" \
    --data-urlencode "search=$QUERY" \
    --data-urlencode "skip=$SKIP" \
    --data-urlencode "take=$BATCH_SIZE" \
    --data-urlencode "sortBy=timestamp desc")

  ALL_ACTIVITIES=$(echo "$ALL_ACTIVITIES" | jq --argjson batch "$BATCH" '. + $batch.items')
  TOTAL_COUNT=$(echo "$BATCH" | jq -r '.count')
  CURRENT_COUNT=$(echo "$ALL_ACTIVITIES" | jq 'length')
  echo "Retrieved $CURRENT_COUNT of $TOTAL_COUNT activities"

  if [ "$CURRENT_COUNT" -ge "$TOTAL_COUNT" ]; then
    break
  fi
  SKIP=$((SKIP + BATCH_SIZE))
done

# Export to JSON
echo "$ALL_ACTIVITIES" | jq '.' > "$JSON_FILE"
echo "Exported $(echo "$ALL_ACTIVITIES" | jq 'length') activities to $JSON_FILE"

# Export to CSV
curl -s -G "$BASE_URL/api/activities/csv" \
  --data-urlencode "token=$API_KEY" \
  --data-urlencode "search=$QUERY" \
  -o "$CSV_FILE"
echo "Exported to $CSV_FILE"

# Print summary
FIRST_TS=$(echo "$ALL_ACTIVITIES" | jq -r '.[0].timestamp')
LAST_TS=$(echo "$ALL_ACTIVITIES" | jq -r '.[-1].timestamp')
echo ""
echo "=== Summary ==="
echo "Total activities: $(echo "$ALL_ACTIVITIES" | jq 'length')"
echo "Time range: $(date -d @$LAST_TS '+%Y-%m-%d %H:%M:%S') to $(date -d @$FIRST_TS '+%Y-%m-%d %H:%M:%S')"
#!/usr/bin/env python3
"""
Shield Activity Extractor
Queries Shield activities and exports to JSON and CSV formats.
"""
import requests
import json
import urllib.parse
from datetime import datetime

# Configuration
BASE_URL = "https://your-shield-host:8080"
API_KEY = "YOUR_API_KEY"
HEADERS = {"Authorization": f"Bearer {API_KEY}"}

def build_query(filters):
    """Convert simple filters to advanced query syntax."""
    response = requests.post(
        f"{BASE_URL}/api/activities/convertsearch",
        headers=HEADERS,
        json={"simpleToAdvanced": filters}
    )
    response.raise_for_status()
    return response.json()["simpleToAdvanced"]

def get_all_activities(query, batch_size=100):
    """Retrieve all activities using pagination."""
    all_activities = []
    skip = 0
    while True:
        params = {
            "search": query,
            "skip": skip,
            "take": batch_size,
            "sortBy": "timestamp desc"
        }
        response = requests.get(
            f"{BASE_URL}/api/activities",
            headers=HEADERS,
            params=params
        )
        response.raise_for_status()
        batch = response.json()
        all_activities.extend(batch["items"])
        print(f"Retrieved {len(all_activities)} of {batch['count']} activities")
        if len(all_activities) >= batch["count"]:
            break
        skip += batch_size
    return all_activities

def export_to_csv(query, filename):
    """Export activities to CSV."""
    params = {
        "token": API_KEY,
        "search": query
    }
    url = f"{BASE_URL}/api/activities/csv?" + urllib.parse.urlencode(params)
    response = requests.get(url)
    response.raise_for_status()
    with open(filename, 'wb') as f:
        f.write(response.content)
    print(f"Exported to {filename}")

def main():
    # Build query for last 7 days with sensitive data detected
    query = build_query({
        "detected": ["true"],
        "timestamp": {"withinLast": {"days": 7, "hours": 0, "minutes": 0}}
    })

    # Retrieve all activities
    activities = get_all_activities(query)

    # Export to JSON
    json_filename = f"activities_{datetime.now().strftime('%Y%m%d_%H%M%S')}.json"
    with open(json_filename, 'w') as f:
        json.dump(activities, f, indent=2)
    print(f"Exported {len(activities)} activities to {json_filename}")

    # Export to CSV
    csv_filename = f"activities_{datetime.now().strftime('%Y%m%d_%H%M%S')}.csv"
    export_to_csv(query, csv_filename)

    # Print summary
    print("\n=== Summary ===")
    print(f"Total activities: {len(activities)}")
    print(f"Time range: {datetime.fromtimestamp(activities[-1]['timestamp'])} to {datetime.fromtimestamp(activities[0]['timestamp'])}")

if __name__ == "__main__":
    main()
#!/usr/bin/env node
/**
 * Shield Activity Extractor
 * Queries Shield activities and exports to JSON and CSV formats.
 */
const axios = require('axios');
const fs = require('fs');

// Configuration
const BASE_URL = 'https://your-shield-host:8080';
const API_KEY = 'YOUR_API_KEY';
const HEADERS = { 'Authorization': `Bearer ${API_KEY}` };

async function buildQuery(filters) {
  // Convert simple filters to advanced query syntax
  const response = await axios.post(
    `${BASE_URL}/api/activities/convertsearch`,
    { simpleToAdvanced: filters },
    { headers: HEADERS }
  );
  return response.data.simpleToAdvanced;
}

async function getAllActivities(query, batchSize = 100) {
  // Retrieve all activities using pagination
  const allActivities = [];
  let skip = 0;
  while (true) {
    const response = await axios.get(
      `${BASE_URL}/api/activities`,
      {
        headers: HEADERS,
        params: {
          search: query,
          skip: skip,
          take: batchSize,
          sortBy: 'timestamp desc'
        }
      }
    );
    const batch = response.data;
    allActivities.push(...batch.items);
    console.log(`Retrieved ${allActivities.length} of ${batch.count} activities`);
    if (allActivities.length >= batch.count) {
      break;
    }
    skip += batchSize;
  }
  return allActivities;
}

async function exportToCSV(query, filename) {
  // Export activities to CSV
  const response = await axios.get(
    `${BASE_URL}/api/activities/csv`,
    {
      params: {
        token: API_KEY,
        search: query
      },
      responseType: 'arraybuffer'
    }
  );
  fs.writeFileSync(filename, response.data);
  console.log(`Exported to ${filename}`);
}

async function main() {
  try {
    // Build query for last 7 days with sensitive data detected
    const query = await buildQuery({
      detected: ['true'],
      timestamp: { withinLast: { days: 7, hours: 0, minutes: 0 } }
    });

    // Retrieve all activities
    const activities = await getAllActivities(query);

    // Export to JSON
    const timestamp = new Date().toISOString().replace(/[-:]/g, '').slice(0, 15);
    const jsonFilename = `activities_${timestamp}.json`;
    fs.writeFileSync(jsonFilename, JSON.stringify(activities, null, 2));
    console.log(`Exported ${activities.length} activities to ${jsonFilename}`);

    // Export to CSV
    const csvFilename = `activities_${timestamp}.csv`;
    await exportToCSV(query, csvFilename);

    // Print summary
    console.log('\n=== Summary ===');
    console.log(`Total activities: ${activities.length}`);
    const firstTime = new Date(activities[0].timestamp * 1000).toLocaleString();
    const lastTime = new Date(activities[activities.length - 1].timestamp * 1000).toLocaleString();
    console.log(`Time range: ${lastTime} to ${firstTime}`);
  } catch (error) {
    console.error('Error:', error.message);
    process.exit(1);
  }
}

main();
Available Filter Fields
Common filter fields you can use in queries:
| Field | Type | Description | Example |
|---|---|---|---|
| timestamp | time range | Activity timestamp | {"withinLast": {"days": 7, "hours": 0, "minutes": 0}} |
| detected | boolean | Sensitive data was detected | ["true"] or ["false"] |
| obfuscated | boolean | Data was masked | ["true"] or ["false"] |
| blocked | boolean | Request was blocked | ["true"] or ["false"] |
| detectedDatatypes | UUID[] | Data type UUIDs detected | ["uuid-1", "uuid-2"] |
| obfuscatedDatatypes | UUID[] | Data type UUIDs obfuscated | ["uuid-1", "uuid-2"] |
| apps | UUID[] | Application UUIDs | ["uuid-1"] |
| rules | UUID[] | Rule UUIDs matched | ["uuid-1"] |
| usernames | string[] | Username | ["john.doe@company.com"] |
| userGroups | string[] | User group | ["Finance", "Engineering"] |
| url | string[] | Request URL | ["https://api.example.com"] |
| hostnames | string[] | Hostname | ["api.example.com"] |
| contentTypes | string[] | Content type | ["application/json"] |
| icapMode | string[] | ICAP mode | ["REQMOD", "RESPMOD", "API"] |
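Fields from the table combine freely in one filter object, since the simple-search payload is just nested JSON. A quick illustration (the UUID is a placeholder, and the dict would be wrapped as {"simpleToAdvanced": ...} by the convertsearch request):

```python
# Mixing a boolean flag, a UUID list, a string list, and a time window
combined_filter = {
    "detected": ["true"],
    "blocked": ["false"],
    "detectedDatatypes": ["uuid-1"],  # placeholder UUID
    "hostnames": ["api.example.com"],
    "timestamp": {"withinLast": {"days": 1, "hours": 0, "minutes": 0}},
}
print(sorted(combined_filter))
```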
For a complete list of filter fields, see the Activities API Reference.
Related Topics
- GET /api/activities - Complete API reference
- POST /api/activities/convertsearch - Query builder reference
- GET /api/activities/csv - CSV export reference
Troubleshooting
Empty Results
If your query returns no activities:
- Verify the time range includes activity periods
- Check filter values (UUIDs must match exactly)
- Try a broader query first, then add filters incrementally
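One way to follow the "broaden first" advice is to start from a wide time window and add one filter at a time, checking the match count at each step. A sketch of that loop; the fetch_count callable is a stand-in that in real use would wrap build_query() plus get_activities()["count"]:

```python
def narrow_down(base_filter, extra_filters, fetch_count):
    """Add extra filters one at a time, reporting which one empties the results.

    fetch_count takes a filter dict and returns a match count; here it is a
    parameter so the logic can run without a live Shield server.
    """
    current = dict(base_filter)
    for key, value in extra_filters.items():
        candidate = {**current, key: value}
        count = fetch_count(candidate)
        print(f"After adding {key!r}: {count} matches")
        if count == 0:
            print(f"Filter {key!r} eliminated all results; check its values")
            return current  # last filter set that still matched something
        current = candidate
    return current

# Stubbed counter: pretend the 'usernames' filter is the one that fails
stub = lambda f: 0 if "usernames" in f else 42
result = narrow_down(
    {"timestamp": {"withinLast": {"days": 30, "hours": 0, "minutes": 0}}},
    {"detected": ["true"], "usernames": ["nobody@example.com"]},
    stub,
)
```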
Performance Issues
For queries spanning long time ranges:
- Use smaller time windows (e.g., daily instead of monthly)
- Export directly to CSV instead of loading into memory
- Use pagination with smaller batch sizes
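The memory advice above can be implemented by streaming batches with a generator instead of accumulating one big list. This sketch takes any page-fetching callable (in practice, the get_activities function from Step 4) so the pagination logic itself can be exercised without a live server:

```python
def iter_activities(fetch_page, batch_size=25):
    """Yield activities one at a time, fetching pages lazily.

    fetch_page(skip, take) must return a dict with 'items' and 'count',
    matching the /api/activities response shape.
    """
    skip = 0
    while True:
        batch = fetch_page(skip, batch_size)
        yield from batch["items"]
        skip += batch_size
        if skip >= batch["count"]:
            break

# Stubbed page fetcher simulating 7 activities served 3 at a time
def fake_page(skip, take):
    items = list(range(7))[skip:skip + take]
    return {"items": items, "count": 7}

seen = list(iter_activities(fake_page, batch_size=3))
print(len(seen))  # 7
```

Because results are yielded one at a time, each activity can be processed or written out and discarded, keeping memory flat regardless of the total count.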
Query Syntax Errors
Always use the convertsearch endpoint to build queries programmatically. Manual query construction is error-prone and not recommended.
For detailed troubleshooting, see the Authentication Troubleshooting Guide.