Chaos Engineering – File Connection Leak

Many Java applications still use files for importing and exporting data. If the connections to these files are not properly closed, they can leak in large numbers, causing the application to slow down or even crash.

In our series of chaos engineering articles, we have been learning how to simulate various performance problems. In this post, let’s discuss how to debug and identify the root causes of file connection issues.

Sample Program

Here is a sample program from the open-source BuggyApp application. It leaks file connections, which can eventually lead to ‘out of connections’ exceptions.

The code below opens a connection to a file but never closes it afterwards. This results in a file connection leak and, ultimately, an ‘out of connections’ error.

Review the program carefully: the leakConnection() method opens a BufferedReader on the sample file but never closes it.

/**
 * Connects to a sample file and does not close it,
 * deliberately leaking the file connection.
 */
public void leakConnection() {
	// Requires java.io.BufferedReader and java.io.FileReader;
	// SAMPLE_FILE_NAME is a field of the enclosing BuggyApp class.
	BufferedReader reader = null;
	try {
		System.out.println("Leaking File connections new");
		reader = new BufferedReader(new FileReader(SAMPLE_FILE_NAME));
		Thread.sleep(1000);
		String line;
		while ((line = reader.readLine()) != null) {
			// IO operations
		}
	} catch (Exception e) {
		e.printStackTrace();
	}
	// Note: reader.close() is never called, so the file connection leaks.
}
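For contrast, below is a minimal sketch of how the same read could avoid the leak by using try-with-resources, which closes the reader automatically even when an exception is thrown. The method name readWithoutLeak is illustrative; SAMPLE_FILE_NAME and the enclosing class are assumed to be the same as in the snippet above.

/**
 * Reads the same sample file, but lets try-with-resources
 * close the connection automatically.
 */
public void readWithoutLeak() {
	// The reader is declared in the try header, so close() is
	// invoked for us when the block exits, normally or with an exception.
	try (BufferedReader reader = new BufferedReader(new FileReader(SAMPLE_FILE_NAME))) {
		String line;
		while ((line = reader.readLine()) != null) {
			// IO operations
		}
	} catch (Exception e) {
		e.printStackTrace();
	}
}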

Diagnosing File Connection Leaks

You can diagnose file connection leaks through manual review or by using readily available root cause analysis tools.

Manual Methods

A simple manual method for detecting file connection leaks is the operating system’s list open files (lsof) command. Find the Java process ID (for example with the jps command) and run lsof against it.

If the number of open file connections keeps growing each time you check, connections are leaking, and the application will eventually run into connection exceptions.

lsof  |  grep 24597
java      24597 userdb    5r      REG                1,4        151            39676280 /Users/userdb/Documents/Tier1App/buggyapp/buggyapp-samplefile.txt
java      24597 userdb    6r      REG                1,4        151            39676280 /Users/userdb/Documents/Tier1App/buggyapp/buggyapp-samplefile.txt
java      24597 userdb    7r      REG                1,4        151            39676280 /Users/userdb/Documents/Tier1App/buggyapp/buggyapp-samplefile.txt
java      24597 userdb    8r      REG                1,4        151            39676280 /Users/userdb/Documents/Tier1App/buggyapp/buggyapp-samplefile.txt
java      24597 userdb    9r      REG                1,4        151            39676280 /Users/userdb/Documents/Tier1App/buggyapp/buggyapp-samplefile.txt

Once the leaked connections reach the maximum number of open files allowed for the process, errors (such as ‘Too many open files’) start appearing in the application logs, which points to a possible connection leak.
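If you prefer to watch the count from inside the JVM rather than running lsof repeatedly, the sketch below polls the open file descriptor count exposed by the JDK’s UnixOperatingSystemMXBean. It is a minimal example that assumes a HotSpot/OpenJDK JVM running on a Unix-like system; the class name and polling interval are purely illustrative.

import java.lang.management.ManagementFactory;
import java.lang.management.OperatingSystemMXBean;
import com.sun.management.UnixOperatingSystemMXBean;

public class FdMonitor {

	public static void main(String[] args) throws InterruptedException {
		OperatingSystemMXBean os = ManagementFactory.getOperatingSystemMXBean();
		if (!(os instanceof UnixOperatingSystemMXBean)) {
			System.out.println("Open file descriptor count is not available on this platform");
			return;
		}
		UnixOperatingSystemMXBean unixOs = (UnixOperatingSystemMXBean) os;
		while (true) {
			// A count that only ever grows is a strong hint of a file connection leak.
			System.out.println("Open file descriptors: " + unixOs.getOpenFileDescriptorCount()
					+ " / max: " + unixOs.getMaxFileDescriptorCount());
			Thread.sleep(5000);
		}
	}
}

Running something like this alongside the leaking method above makes the steady growth in the descriptor count easy to spot without leaving the JVM.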

Automated Methods

You can use root cause analysis tools such as yCrash, which automatically captures a 360-degree view of the system. For file connection leaks, it can capture and alert on connection failures, as well as on resource spikes, such as the high CPU usage that often accompanies a large number of leaked connections.

Below is yCrash highlighting a spike in CPU usage caused by a Java process that is leaking file connections:

Fig 1: yCrash showing high CPU usage

Summation

In this article, we explored the risks of file connection leaks in Java applications. These leaks can lead to performance issues and ‘out of connections’ errors. We discussed manual methods like using “lsof” to monitor connection counts and automated tools such as yCrash to detect leaks. Addressing file connection leaks is vital to maintain application stability and reliability.
