Operational Information Gathering Methodologies of an Analyst

A brief analysis of intelligence activities abroad

Alessandro Del Bianco
AllertaSulWeb

--

In the realm of intelligence, the collection, analysis, and sharing of information constitute one of the cornerstones of a nation’s organization and security policies. It’s often a mysterious world, governed by secrecy, yet of vital importance for national defense, threat management, and the formulation of wide-ranging policies that impact millions of citizens.

The goal of intelligence is twofold: on one hand, to protect the country from both internal and external threats, ranging from terrorism to organized crime; on the other hand, to provide a strategic advantage, both in terms of security and in economic and geopolitical competition.

This article will delve into the intricate world of intelligence, examining the processes and methodologies employed by intelligence analysts. We will explore how information is gathered, both from human sources and open sources, and how this information is processed and interpreted to make critical decisions. Additionally, we will address the significance of encryption and communication security in a world where privacy and security often collide. Moreover, we’ll tackle the delicate issue of information sharing among agencies and nations, highlighting the critical questions it raises about transparency, accountability, and democratic oversight.

This journey into intelligence raises important ethical and political questions in an era where technology has made widespread surveillance and interception of communications possible. It leads us to reflect on the tensions between security and individual freedom, secrecy and public accountability. In the end, it leaves us with fundamental questions about the role of intelligence in a democratic and interconnected society.


Identity

An analyst on a mission must adopt a fictitious identity that seamlessly blends into the environment in which they operate. This identity can be that of a diplomat, a businessman, a journalist, or any other credible persona. For instance, if the analyst poses as a journalist, they must have a credible cover, such as writing actual articles or participating in journalism-related events. In some cases, it’s even possible to use a cover company with a legitimate business name, which may be registered with local authorities.

In some cases, analysts can operate abroad with the tacit consent of the host country. For example, an analyst may be accredited as a diplomat at an embassy and enjoy diplomatic immunity. This arrangement can be a part of broader diplomatic relations between the countries involved. Such agreements can facilitate access to sensitive information or influence the analyst’s activities abroad.

Engagement of External Companies

In some situations, analysts can establish agreements with foreign companies or organizations to provide credible cover. These companies may be engaged in legitimate business activities, but part of their operation is to provide fictitious identities. For example, an international business consulting firm could “employ” an analyst as a consultant, thus enabling them to operate without drawing unwanted attention.

Human Collection

Gathering information through direct interaction with human sources is an essential part of an analyst’s work abroad. This operational methodology involves various aspects, including recruitment techniques, source management, psychological manipulation tactics, and incentives.

Recruitment

Recruiting human sources is the process of persuading or compelling individuals to provide sensitive information. Recruitment techniques vary, but some include:

  • Financial incentives: Offering money or other valuable goods in exchange for information.
  • Psychological pressure: Exploiting personal situations, such as debts or threats, to compel the individual to cooperate.
  • Ideology: Using common ideological or political beliefs to persuade the individual to disclose information.
  • Moral leverage: Appealing to the individual’s morality to gain their cooperation.

Once recruited, sources must be managed carefully to ensure the continuity of cooperation. This requires empathy, attention, and the ability to maintain the source’s trust. Software such as encrypted databases or secure communication platforms can be used to manage information shared by sources securely.
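
As a minimal sketch of the "encrypted database" idea (illustrative only, not any agency's actual tooling), notes about a source can be encrypted before they are written to a local SQLite store. The cryptography library's Fernet primitive and the codename HERON below are assumptions made purely for the example:

import sqlite3
from cryptography.fernet import Fernet

# The key would normally live in a separate, protected key store
key = Fernet.generate_key()
fernet = Fernet(key)

conn = sqlite3.connect('sources.db')
conn.execute("CREATE TABLE IF NOT EXISTS notes (codename TEXT, note BLOB)")

# Encrypt the note before it ever touches the disk
ciphertext = fernet.encrypt("Meeting rescheduled to Thursday.".encode())
conn.execute("INSERT INTO notes VALUES (?, ?)", ("HERON", ciphertext))
conn.commit()

# Reading the note back requires the same key
row = conn.execute("SELECT note FROM notes WHERE codename = ?", ("HERON",)).fetchone()
print(fernet.decrypt(row[0]).decode())
conn.close()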

Psychological Manipulation

Psychological manipulation is a complex and controversial aspect of analysts’ work. It involves a set of techniques aimed at influencing the behavior or decisions of individuals or groups through the use of psychological tactics. This process is based on an understanding of human psychology and the cognitive vulnerabilities of people. Let’s take a closer look at how psychological manipulation works and what tools and techniques are employed.

  • Incentives and Rewards: Analysts can use financial incentives, offers of political asylum, or promises of protection to persuade individuals to cooperate. These incentives can range from money to new identities.
  • Threats and Blackmail: On the other end of the spectrum, threats of violence, blackmail, or the disclosure of compromising information are used to compel people to cooperate.
  • Social Isolation: The individual can be isolated from their social and family ties, making them more vulnerable and dependent on the intelligence agency.
  • Mental Confusion: Bombarding the target with conflicting information can disorient them and impair their judgment.
  • Persuasion Techniques: Psychological persuasion can be used to convince a person that cooperation is in the interest of the common good or their own security.

For example, if one wanted to obtain information from a key source in a foreign country, they might start with a friendly approach, offering financial incentives in exchange for information. If the person is reluctant, veiled threats could be used to increase pressure. Meanwhile, the source could be isolated from their contacts and their perceptions of the situation manipulated. This combination of tactics can be employed to convince the source to cooperate.

Incentives

Incentives can be used to reward source cooperation. They may include:

  • Financial Benefits: Regular payments or bonuses based on the information provided.
  • Protection: Ensuring the source’s safety and protection from external threats.
  • Personal Advantages: Providing benefits such as visas or assistance in resolving personal issues.

Security and Communication Software

Communication between the analyst and sources must be highly secure. Software offering end-to-end encryption can be used to ensure that communications remain private. Additionally, the use of VPNs or Tor network services can help hide the origin of communications and protect the anonymity of sources.
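
As a simple illustration of hiding the origin of web traffic, HTTP requests can be routed through a locally running Tor client's SOCKS proxy. The sketch below assumes Tor is listening on its default port 9050 and that the requests library is installed with SOCKS support (requests[socks]); the lookup URL is just an example service that echoes the caller's apparent IP address:

import requests

# Route the request through the local Tor SOCKS proxy (default port 9050).
# The 'socks5h' scheme makes DNS resolution happen inside Tor as well.
proxies = {
    'http': 'socks5h://127.0.0.1:9050',
    'https': 'socks5h://127.0.0.1:9050',
}

response = requests.get('https://api.ipify.org', proxies=proxies, timeout=30)
print("Apparent origin IP:", response.text)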


Electronic Surveillance

Electronic surveillance is a fundamental part of analysts’ activities as it enables them to gather valuable information through the interception of electronic communications.

Electronic Communication Interception involves collecting and monitoring communications such as phone calls, email messages, SMS, and data transmitted over computer networks. This process can be carried out in various ways, including:

  • Phone Wiretapping: Intercepting telephone conversations through devices connected to phone lines or telephone exchanges.
  • Computer Network Monitoring: Recording network traffic to capture data transmitted over the Internet.
  • Email Message Interception: Unauthorized access to email accounts for the purpose of monitoring communications.

Software and Surveillance Tools

Wireshark is a well-known network protocol analyzer that can monitor and dissect network traffic. Analysts can use Wireshark to capture data packets transmitted over a network and inspect them to extract relevant information. For example, if an analyst suspects that an individual is communicating with suspicious organizations, they can use Wireshark to examine the captured traffic for evidence.
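
The same kind of inspection can be scripted. The sketch below uses pyshark, a Python wrapper around Wireshark's tshark engine, to pull the queried domain names out of a saved capture; the file name traffic.pcap is a placeholder:

import pyshark

# Open a saved capture and keep only DNS queries (not responses)
capture = pyshark.FileCapture('traffic.pcap', display_filter='dns.flags.response == 0')

for packet in capture:
    try:
        # dns.qry_name holds the domain name being looked up
        print(packet.dns.qry_name)
    except AttributeError:
        # Skip packets without a DNS query section
        continue

capture.close()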

Alternatively, or in conjunction, an IMSI catcher (International Mobile Subscriber Identity catcher) can be used: a device that intercepts mobile communications by emulating a cell tower and forcing nearby mobile phones to connect to it. This allows the operator to intercept communications, including SMS messages and voice calls. Analysts can use IMSI catchers to gather information from mobile phones in a specific area.


Open Source Collection and OSINT

Collecting data from open sources is an essential part of intelligence activities, enabling analysts to obtain information from websites, social media, and online publications. Here’s how this operational methodology works with Apache Nutch, Scrapy, NLTK, and Lexalytics.

Apache Nutch is an open-source framework for collecting data from the internet. It is widely used to create web search engines and extract data from websites. Nutch can be configured to visit websites, download pages, analyze them, and extract structured or unstructured data. Below is an example of how a Nutch project could be configured to collect data from a website:

<!-- Nutch configuration file: nutch-site.xml -->
<configuration>
  <property>
    <name>http.agent.name</name>
    <value>NutchAgentName</value>
  </property>
  <property>
    <name>plugin.includes</name>
    <value>protocol-http|urlfilter-regex|parse-(html|tika)|index-(basic|anchor)|urlnormalizer-(pass|regex|basic)|scoring-opic</value>
  </property>
</configuration>

Scrapy is a Python web scraping framework used to extract structured data from websites. A basic Scrapy spider might look like this:

import scrapy

class ExampleSpider(scrapy.Spider):
    name = 'example'
    start_urls = ['https://www.example.com']

    def parse(self, response):
        title = response.css('h1::text').get()
        paragraphs = response.css('p::text').getall()
        yield {
            'Title': title,
            'Paragraphs': paragraphs
        }

NLTK is a Python library used for natural language processing. It can be employed for text analysis and information extraction. Here’s an example of how NLTK can be used for text analysis:

import nltk
from nltk.tokenize import word_tokenize

# Download the tokenizer and POS-tagger models on first use
nltk.download('punkt')
nltk.download('averaged_perceptron_tagger')

text = "This is an example of text to be analyzed with NLTK."
words = word_tokenize(text)
pos_tags = nltk.pos_tag(words)
print(pos_tags)
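
Since entity extraction is central to the use case described here, the same part-of-speech tags can also be fed into NLTK's named entity chunker. This is a minimal sketch that reuses the pos_tags variable from the snippet above:

import nltk

# Models needed for named entity chunking (downloaded once)
nltk.download('maxent_ne_chunker')
nltk.download('words')

tree = nltk.ne_chunk(pos_tags)
for subtree in tree:
    # Chunked nodes carry a label such as PERSON, ORGANIZATION, or GPE
    if hasattr(subtree, 'label'):
        name = " ".join(token for token, tag in subtree.leaves())
        print(subtree.label(), name)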

Extracting Meanings and Emotions from Texts

Lexalytics is a semantic analysis software that can be used to extract meanings and emotions from texts.

Consider, for example, an intelligence agency that has acquired a large volume of textual documents from various open sources, such as news articles, blogs, and social media posts, written in a foreign language. These documents may contain information potentially relevant to intelligence activities, but in a language the analysts cannot read.

In this scenario, Lexalytics can be used to analyze and extract useful information from these documents.

  1. Data Preparation: First, the documents are prepared for analysis. This may include removing formatting elements, standardizing text, and breaking down documents into paragraphs or sentences.
  2. Sentiment Analysis: Lexalytics can be used to assess the sentiment of the texts. This means the software can determine whether a document or a portion of it expresses positive, negative, or neutral opinions. For example, it might be able to detect whether an article contains favorable or unfavorable opinions on a specific topic.
  3. Entity Extraction: Lexalytics can extract entities from the texts, such as names of people, organizations, or places. This helps analysts identify who or what is mentioned in the documents.
  4. Relationship Analysis: The software can also analyze how entities are connected to each other in the documents. For instance, it might detect that a particular organization is involved in a specific event mentioned in the documents.
  5. Theme Analysis: Lexalytics can identify the main themes addressed in the documents. For example, it might detect that a series of articles are related to foreign policy or national security.
  6. Classification: The software can be trained to classify documents into specific categories. For example, it could be used to identify which documents are directly related to a potential threat and which are informative but not relevant to security.
  7. Reports and Visualizations: Lexalytics can generate reports and visualizations of the extracted data to allow analysts to examine the information more efficiently.
Putting these steps together, a simplified Python workflow might look like this:

from lexalytics.storage import Storage
from lexalytics.nlp import NLP

# Initializing Lexalytics with API credentials
nlp = NLP("your_api_key")
storage = Storage("your_api_key")

# Text document in a foreign language
document = """
Here is an example of a text document in a foreign language. This document contains information
relevant to intelligence activities but is written in a language not understandable to analysts.
"""

# Data preparation
document = document.strip()      # Remove unwanted spaces and formatting characters
sentences = document.split('.')  # Split the document into sentences

# Variables for collecting the analysis results
entities = []
themes = []

# Analyze each sentence in the document
for sentence in sentences:
    results = nlp.entities(sentence)  # Extract entities from the sentence
    entities.extend(results['entities'])

    results = nlp.topics(sentence)    # Identify main topics
    themes.extend(results['topics'])

# Print identified entities and main topics
print("Entities:")
for entity in entities:
    print(f"Text: {entity['text']}, Type: {entity['type']}")

print("\nMain Topics:")
for topic in themes:
    print(f"Topic: {topic['label']}")

# Generating reports or data visualizations
# This part of the code can be customized according to the intelligence agency's needs.

In this example, the Python code uses the Lexalytics API to analyze the foreign-language text document. The document is prepared by removing unwanted spaces and formatting characters and is then split into sentences. The code then extracts the entities and main topics in each sentence. The results are collected and can be further processed or visualized according to the intelligence agency's requirements.

Another example could be:

from lexalytics.storage import Storage
from lexalytics.nlp import NLP

# Lexalytics initialization
nlp = NLP("your_api_key")
storage = Storage("your_api_key")

# Text to analyze
text = "The G20 summit will be held in Rome in 2024."

# Text analysis
results = nlp.entities(text)

# Entity extraction
entities = results['entities']

# Print entities
for entity in entities:
    print(f"Entity: {entity['text']}, Type: {entity['type']}")

This code uses the Lexalytics API to analyze a sample text and identify the entities it contains, returning their text and type (e.g., person, place, or date). Lexalytics also offers advanced features such as sentiment analysis and the extraction of key themes, and it can be applied to thousands of social media posts in near real time. Here's a simplified example that scores the sentiment of a single post:

from lexalytics.storage import Storage
from lexalytics.nlp import NLP

# Initializing Lexalytics with API credentials
nlp = NLP("your_api_key")
storage = Storage("your_api_key")

# Sample social media post to analyze
post = "The debate on cybersecurity is on the rise. #Cybersecurity #CyberThreats"

# Text analysis
results = nlp.sentiment(post)

# Extracting sentiment
sentiment = results['sentiment']
print(f"Sentiment: {sentiment}")

The output from Lexalytics, in the form of structured data and analysis, is valuable for analysts as it allows them to quickly examine and understand the content of foreign language documents, identifying crucial information for intelligence activities. Moreover, Lexalytics can be customized and tailored to the specific needs of an intelligence agency to ensure the relevance and accuracy of data analysis.

Data collection from open sources and subsequent analysis of the extracted data are fundamental activities in intelligence. These tools and frameworks greatly simplify the process, enabling analysts to efficiently obtain relevant information from a wide range of online sources.


Data Analysis and Interpretation

Data analysis and interpretation are crucial in the work of intelligence analysts as they allow for the identification of trends, the discovery of threats, and the revelation of opportunities.

IBM SPSS is a widely used software package for statistical analysis and data mining. Analyses can be run through its graphical interface or scripted in SPSS syntax. Here's a simple syntax example that produces descriptive statistics and means for two variables:

DESCRIPTIVES VARIABLES=Variable1 Variable2.
MEANS TABLES=Variable1 Variable2.
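
For analysts who prefer to stay in Python, roughly the same descriptive statistics can be computed with pandas. The column names Variable1 and Variable2 below mirror the SPSS example, and the values are placeholder data:

import pandas as pd

# A small placeholder dataset with the same column names as the SPSS example
df = pd.DataFrame({
    'Variable1': [12, 15, 14, 10, 18],
    'Variable2': [7, 9, 6, 11, 8],
})

# Summary statistics (count, mean, std, min, quartiles, max) for both variables
print(df[['Variable1', 'Variable2']].describe())

# Just the means, matching the MEANS command above
print(df[['Variable1', 'Variable2']].mean())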

R is a programming language and environment for statistical analysis and graphics. It can be used to perform advanced data analysis. Here’s an example of R code to create a scatterplot:

# Loading a sample dataset
data <- mtcars

# Creating a scatterplot between two variables
plot(data$mpg, data$hp, main="Scatterplot MPG vs. HP", xlab="Miles per Gallon", ylab="Horsepower (HP)")

Python is an extremely flexible and widely used programming language for data analysis. TensorFlow is a machine learning library developed by Google. Here’s an example of Python code that uses TensorFlow to train a classification model:

import tensorflow as tf
from tensorflow import keras

# Loading a sample dataset
fashion_mnist = keras.datasets.fashion_mnist
(train_images, train_labels), (test_images, test_labels) = fashion_mnist.load_data()

# Defining a neural network model
model = keras.Sequential([
    keras.layers.Flatten(input_shape=(28, 28)),
    keras.layers.Dense(128, activation='relu'),
    keras.layers.Dense(10, activation='softmax')
])

# Compiling the model
model.compile(optimizer='adam', loss='sparse_categorical_crossentropy', metrics=['accuracy'])

# Training the model
model.fit(train_images, train_labels, epochs=5)

Scikit-learn is a Python library for machine learning and data analysis. Here’s an example of code that uses scikit-learn to train a support vector classifier:

from sklearn import datasets
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Loading a sample dataset
iris = datasets.load_iris()
X = iris.data
y = iris.target

# Splitting the data into training and test sets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Creating and training a support vector classifier
clf = SVC()
clf.fit(X_train, y_train)

# Evaluating the classifier's performance
y_pred = clf.predict(X_test)
accuracy = accuracy_score(y_test, y_pred)
print("Accuracy:", accuracy)

Data analysis is essential for identifying patterns, relationships, and meaningful insights. These tools and languages provide analysts with the ability to analyze complex data and make informed decisions based on the information gathered.


Encryption and Communication Security

Encryption and communication security are essential to protect the confidentiality and integrity of information exchanged between analysts and sources or their superiors. Some well-known software includes:

  • PGP (Pretty Good Privacy): PGP is widely used public-key encryption software for securing emails and other sensitive data. It allows users to encrypt their messages and digitally sign them for authenticity (a minimal sketch follows this list).
  • Signal: Signal is an instant messaging application that offers end-to-end encrypted messages, encrypted voice calls, and secure video calls.
  • BitLocker: BitLocker is data encryption software developed by Microsoft for hard drive protection. It’s often used to encrypt the entire disk or specific partitions.
  • VeraCrypt: VeraCrypt is open-source software that allows you to create encrypted volumes for data protection. It can be used to encrypt hard drives, USB drives, and more.
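
Of the tools above, PGP is the easiest to illustrate in code. The sketch below uses python-gnupg, a Python wrapper around the GnuPG binary; it assumes GnuPG is installed and that a key for the placeholder address analyst@example.org already exists in the local keyring:

import gnupg

gpg = gnupg.GPG()

# Encrypt a report so that only the holder of the recipient's private key can read it
encrypted = gpg.encrypt("Contents of the field report.", 'analyst@example.org')
print(encrypted.ok)
print(str(encrypted)[:60])  # ASCII-armored ciphertext

# Decryption on the recipient's side, using the passphrase protecting their private key
decrypted = gpg.decrypt(str(encrypted), passphrase='recipient-passphrase')
print(decrypted.ok, str(decrypted))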

Custom Solutions

Intelligence agencies often develop custom encryption and communication security solutions to meet their specific needs. These custom solutions are not accessible to the public and are kept strictly confidential. Such custom solutions may involve:

  • Custom Symmetric or Asymmetric Key Cryptography: Creating custom encryption algorithms and protocols to ensure communication security.
  • Key Management Systems: Sophisticated systems for managing and distributing encryption keys, ensuring that only authorized individuals can decrypt messages.
  • Custom VPN Tunneling: Developing customized Virtual Private Network (VPN) solutions to secure communications between endpoints.
  • Sensitive Document Encryption: Encrypting sensitive documents or files for secure storage and transmission.
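
As a minimal, illustrative sketch of the last point, a sensitive file can be encrypted with a symmetric key before storage or transmission; the cryptography library's Fernet primitive and the file name report.docx are assumptions made for the example, and in practice the key itself would be handled by a key management system:

from cryptography.fernet import Fernet

# Generate a symmetric key; distributing and protecting it is the key management system's job
key = Fernet.generate_key()
fernet = Fernet(key)

# Encrypt the document for storage or transmission
with open('report.docx', 'rb') as f:
    ciphertext = fernet.encrypt(f.read())

with open('report.docx.enc', 'wb') as f:
    f.write(ciphertext)

# Decryption requires the same key
plaintext = fernet.decrypt(ciphertext)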

Access to these custom solutions is strictly regulated and limited to authorized personnel. The use of such solutions is bound by stringent security protocols and may require two-factor authentication or advanced verification procedures.


Information Sharing

Sharing information among analysts, agents, and agencies is essential for the success of intelligence operations. However, it is important to do so securely and secretly to prevent unauthorized disclosure of information. Agencies often develop custom data sharing solutions to meet their specific needs.

Intelligence agencies can also rely on third-party services for secure information sharing. These services provide advanced encryption and authentication solutions to ensure that only authorized individuals can access the data. For instance, an agency might use secure cloud services like Tresorit, or exchange encrypted containers created with VeraCrypt, to share sensitive information.

Plausible Deniability

Plausible deniability is a common practice in intelligence activities. It involves denying or disavowing involvement in an operation or the use of particular software or services. This can be useful for protecting the anonymity of agents or ensuring that activities remain secretive. Plausible deniability can be achieved through the use of intermediaries, the creation of false leads, or the obfuscation of information to conceal the actual source or the use of specific technology.

The balance between the need for national security and the respect for civil rights is a critical issue. In an era where technology has made widespread surveillance and massive data collection possible, important questions about privacy and ethics arise. As individuals, we must consider how much we are willing to sacrifice in terms of privacy to ensure society’s security. What control measures are necessary to prevent abuses of power?

Furthermore, transparency and accountability of intelligence agencies raise questions about democratic oversight. Who watches the watchers? The accountability of agencies is a significant concern, as the lack of oversight could lead to abuses of power.

Lastly, the line between traditional intelligence and cybersecurity is becoming blurred. Cyber threats present complex challenges and require global cooperation. What international regulations can balance security and privacy in the digital age?

As a society, we must continue to critically examine these issues and ensure that intelligence activities genuinely serve the common good, while respecting democratic values and human rights.

“In the digital age, intelligence is the double-edged sword that can protect or threaten individual freedom.” Edward Snowden

To support me and read practical articles and tutorials, you can visit my BuyMeACoffee page.


Passionate about international politics, world cultures, cinema, and video games. SMEM consultant. OSINT and SOCMINT operator.