Readspeed Test v3.0 HD/OS buffered and unbuffered speeds.

Applications, Games, Tools, User libs and useful stuff coded in PureBasic
Rescator
Addict
Posts: 1769
Joined: Sat Feb 19, 2005 5:05 pm
Location: Norway

Readspeed Test v3.0 HD/OS buffered and unbuffered speeds.

Post by Rescator »

This is based on the v2.x code from here, but v3 is Windows only, sorry about that.

The issue with measuring read speed from HDD/SSD/SATA/USB devices is that Windows has a very aggressive cache. In that thread Fred says: "IIRC Windows uses its whole free memory as disk cache. It's shadow mem, which means it's not visible to taskmanager or such. It gets automatically released when an app claims memory."

But luckily I managed to find a way to selectively turn off Windows file caching for individual files.
This means no nasty tricks like allocating all the system memory to force Windows to flush the file cache,
no need for an administrative-privilege program that uses special system APIs to mess with the OS caching and memory handling,
and no need, worse yet, to use FILE_FLAG_NO_BUFFERING and do raw unbuffered I/O, which requires you to align memory and do seeks on multiples of the physical sector size of the medium (disk) and is a huge pain to code.
The solution below is very simple but not that well known (from what I've seen on the net so far).
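
In essence the trick is just this (a minimal sketch without error handling, and the path is only a made-up example; the full WinAPI_ReleaseFileCache() procedure is in the source further down):

Code: Select all

Define file$ = "\\?\C:\Temp\testfile.bin"	;example path; full paths only, the \\?\ prefix disallows relative paths
Define fh.i

fh = CreateFile_(file$, #GENERIC_READ | #GENERIC_WRITE, #Null, #Null, #OPEN_EXISTING, #FILE_FLAG_NO_BUFFERING, #Null)
If fh <> #INVALID_HANDLE_VALUE
	CloseHandle_(fh)	;closed at once, nothing is written; Windows drops its cached data for this one file
EndIf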

The v3 code enables testing of unbuffered random, unbuffered sequential, buffered random and buffered sequential reads.
A large set of test files with varying sizes is suggested (the files in a messy download folder are a nice collection of test files, for example).
Do note that since 4 different tests and 13 chunk sizes are tested, the test run will take a while, and ideally you should not disturb it: at the very least try not to do any other activity on the disk you are testing, and avoid heavy CPU use or heavy activity on the other disks in the system.

Here is one example: Google chart API image of this test.

Code: Select all

Readspeed Test v3.0
---------------------------
The baseline is the slowest chunksize, and shown as 100.00 %
Based on 5 files and 52.26 MB of data (processed 2717.29 MB).
Test took 0.869 minutes, speed was 52.42 MB/s (419.38 mbit/s).
Unbuffered Speed was 26.95 MB/s (215.62 mbit/s).
Buffered Speed was 952.77 MB/s (7622.13 mbit/s).

Chunksize = 4096, speed = 100.00 %
Chunksize = 8192, speed = 150.93 %
Chunksize = 16384, speed = 249.01 %
Chunksize = 32768, speed = 376.61 %
Chunksize = 65536, speed = 570.71 %
Chunksize = 131072, speed = 804.77 %
Chunksize = 262144, speed = 1017.99 %
Chunksize = 524288, speed = 1239.46 %
Chunksize = 1048576, speed = 1444.23 %
Chunksize = 2097152, speed = 1599.42 %
Chunksize = 4194304, speed = 1643.98 %
Chunksize = 8388608, speed = 1475.18 %
Chunksize = 16777216, speed = 1451.85 %
The first unbuffered reads should show roughly the speed of the device itself. The buffered reads should not touch the disk at all if you have a lot of memory and are normally only limited by memory/CPU/bus speeds; in tests here the disk was not touched even though several GB of files in total were read.
The buffered reads are more similar in speed and do not affect the test too much, but in the real world a fresh read and a later buffered read may both occur, so testing both makes sense.

Also note that for the random test, if the file is small enough then no random read is done, as the entire file may fit in the read chunk buffer; this is why you should have a good mix of file sizes, ranging from a few KB to a GB or so.
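
(This is the relevant guard in the random read loop of the source below: the FileSeek() to a random position is only done when the file is larger than the chunk, so small files are simply read straight through.)

Code: Select all

If flen > chunksize
	FileSeek(#File_1, Random(flen - chunksize))	;jump to a random position that still leaves a full chunk to read
EndIf
size - ReadData(#File_1, *mem, chunksize)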

WinAPI_ReleaseFileCache() is a procedure I made that uses the Windows API CreateFile() function in write mode with the flag FILE_FLAG_NO_BUFFERING; nothing is actually written and the file handle is closed at once. This behavior is undocumented (as far as I can tell) but seems to work on Windows 2000/XP and up.

Have fun, and hopefully this test program and the source are useful to somebody. What started as a simple idea/program turned out way more complicated than I thought, but I did learn something: it's possible to uncache individual files on Windows, which is kind of cool, especially for test tools like these.

Code: Select all

;Test Readspeed v3.0, Public Domain.
EnableExplicit

DisableDebugger
#Program_Title = "Readspeed Test v3.0"

CompilerIf #PB_Compiler_OS <> #PB_OS_Windows
	CompilerError "Sorry! No idea how to do unbuffered and buffered API file reads on Mac/Linux, this code is Windows only"
CompilerEndIf

Enumeration 1
	#File_1
EndEnumeration

Global result$, url$

Procedure.i WinAPI_ReleaseFileCache(file$)
	Protected fh.i

	file$ = "\\?\" + file$ ;Full (unicode) paths only, this means relative paths are not supported.
	fh = CreateFile_(file$, #GENERIC_READ | #GENERIC_WRITE, #Null, #Null, #OPEN_EXISTING, #FILE_FLAG_NO_BUFFERING, #Null)
	If fh = #INVALID_HANDLE_VALUE
		;GetLastError_()
		ProcedureReturn #False	;do not pass an invalid handle on to CloseHandle_()
	EndIf

	If CloseHandle_(fh) = #False
		;GetLastError_()
		ProcedureReturn #False
	EndIf

	ProcedureReturn #True
EndProcedure

Structure speedtest_struct
	seq_unbuffered.q
	seq_buffered.q
	rnd_unbuffered.q
	rnd_buffered.q
	chunksize.l
EndStructure

Structure files_struct
	size.q
	name$
EndStructure

Procedure.i ReadspeedTest()
	Protected *mem
	Protected text$
	Protected size.q, totalsize.q, flen.q, filesizetotal.q
	Protected index.i
	Protected base.d, sum.d, maxvalue.d, bufferedsum.d, unbufferedsum.d, speedsum.d
	Protected start.l, stop.l, datestart.l, datestop.l, chunksize.l
	Protected NewList speedtest.speedtest_struct()
	Protected NewList file.files_struct()
	
	text$ = OpenFileRequester("Choose file(s) for readspeed test:", "", "All (*.*)|*.*", 0, #PB_Requester_MultiSelection)
	While text$
		size = FileSize(text$)
		If size > 0
			AddElement(file())
			file()\name$ = text$
			file()\size = size
			filesizetotal + size
		EndIf
		text$ = NextSelectedFileName()
	Wend
	If ListSize(file()) < 1
		ProcedureReturn #False
	EndIf

	;Disable PureBasic's own file buffering so each ReadData() call goes straight to the OS with the chosen chunk size.
	FileBuffersSize(#PB_Default, 0)

	*mem = AllocateMemory(16777216, #PB_Memory_NoClear)
	If *mem = #Null
		ProcedureReturn #False
	EndIf

	;Chunk sizes to test: 4 KB, doubled each step, up to the 16 MB read buffer (13 sizes in total).
	chunksize = 4096
	Repeat
		AddElement(speedtest())
		speedtest()\chunksize = chunksize
		chunksize << 1
	Until (chunksize > MemorySize(*mem))

	datestart = ElapsedMilliseconds()
	ForEach speedtest()
		chunksize = speedtest()\chunksize
		ForEach file()
			ConsoleTitle(#Program_Title + ", file " + Str(ListIndex(file()) + 1) + " of " + Str(ListSize(file())) + ", chunksize = " + Str(chunksize) + " bytes, test " + Str(ListIndex(speedtest()) + 1) + " of " + Str(ListSize(speedtest())) + ".")
			flen = file()\size
			PrintN(GetFilePart(file()\name$) + " " + Str(flen) + " bytes.")
			;Unbuffered random test.
			WinAPI_ReleaseFileCache(file()\name$)
			If ReadFile(#File_1, file()\name$)
				size = flen
				totalsize + size
				If flen > chunksize
					start = ElapsedMilliseconds()
					Repeat
						If flen > chunksize
							FileSeek(#File_1, Random(flen - chunksize))
						EndIf
						size - ReadData(#File_1, *mem, chunksize)
					Until size <= 0
					stop = ElapsedMilliseconds()
				Else
					start = ElapsedMilliseconds()
					Repeat
						size - ReadData(#File_1, *mem, chunksize)
					Until size <= 0
					stop = ElapsedMilliseconds()
				EndIf
				speedtest()\rnd_unbuffered + (stop - start)
				CloseFile(#File_1)
			EndIf
			;Unbuffered sequential test.
			WinAPI_ReleaseFileCache(file()\name$)
			If ReadFile(#File_1, file()\name$)
				size = flen
				totalsize + size
				start = ElapsedMilliseconds()
				Repeat
					size - ReadData(#File_1, *mem, chunksize)
				Until size <= 0
				stop = ElapsedMilliseconds()
				speedtest()\seq_unbuffered + (stop - start)
				CloseFile(#File_1)
			EndIf
			;Buffered random test.
			If ReadFile(#File_1, file()\name$)
				size = flen
				totalsize + size
				start = ElapsedMilliseconds()
				Repeat
					If flen > chunksize
						FileSeek(#File_1, Random(flen - chunksize))
					EndIf
					size - ReadData(#File_1, *mem, chunksize)
				Until size <= 0
				stop = ElapsedMilliseconds()
				speedtest()\rnd_buffered + (stop - start)
				CloseFile(#File_1)
			EndIf
			;Buffered sequential test.
			If ReadFile(#File_1, file()\name$)
				size = flen
				totalsize + size
				start = ElapsedMilliseconds()
				Repeat
					size - ReadData(#File_1, *mem, chunksize)
				Until size <= 0
				stop = ElapsedMilliseconds()
				speedtest()\seq_buffered + (stop - start)
				CloseFile(#File_1)
			EndIf
		Next
	Next
	datestop = ElapsedMilliseconds()
	ConsoleTitle(#Program_Title + " Complete.")
	
	If *mem
		FreeMemory(*mem)
	EndIf
	
	;The baseline is the slowest chunk size: the one with the largest total time averaged over the four tests.
	base = 0.0
	ForEach speedtest()
		sum = (speedtest()\rnd_buffered + speedtest()\rnd_unbuffered + speedtest()\seq_buffered + speedtest()\seq_unbuffered) / 4.0
		If base < sum
			base = sum
		EndIf
	Next

	text$ = ""
	text$ + "The baseline is the slowest chunksize, and shown as 100.00 %" + #LF$
	size = ListSize(file())
	If size > 1
		text$ + "Based on " + Str(size) + " files and " + StrD(filesizetotal / 1048576.0,2) + " MB of data (processed " + StrD(totalsize / 1048576.0,2) + ")." + #LF$
	Else
		text$ + "Based on 1 file" + " and " + StrD(filesizetotal / 1048576.0,2) + " MB of data (processed " + StrD(totalsize / 1048576.0,2) + " MB)." + #LF$
	EndIf

	speedsum = 0.0
	ForEach speedtest()
		speedsum + (speedtest()\rnd_buffered + speedtest()\rnd_unbuffered + speedtest()\seq_buffered + speedtest()\seq_unbuffered)
	Next
	text$ + "Test took " + StrD((datestop - datestart) / 60000.0, 3) + " minutes, speed was " + StrD(totalsize / 1048576.0 / (speedsum / 1000.0), 2) + " MB/s (" + StrD(totalsize / 131072.0 / (speedsum / 1000.0), 2) + " mbit/s)." + #LF$
	
	unbufferedsum = 0.0
	ForEach speedtest()
		unbufferedsum + (speedtest()\rnd_unbuffered + speedtest()\seq_unbuffered)
	Next
	text$ + "Unbuffered Speed was " + StrD(((totalsize / 2.0) / 1048576.0) / (unbufferedsum / 1000.0), 2) + " MB/s (" + StrD(((totalsize / 2.0)  / 131072.0) / (unbufferedsum / 1000.0), 2) + " mbit/s)." + #LF$

	bufferedsum = 0.0
	ForEach speedtest()
		bufferedsum + (speedtest()\rnd_buffered + speedtest()\seq_buffered)
	Next
	text$ + "Buffered Speed was " + StrD(((totalsize / 2.0) / 1048576.0) / (bufferedsum / 1000.0), 2) + " MB/s (" + StrD(((totalsize / 2.0) / 131072.0) / (bufferedsum / 1000.0), 2) + " mbit/s)." + #LF$

	url$ = "http://chart.apis.google.com/chart?chtt=" + #Program_Title
	url$ + "&chs=388x773&chm=N*f*%,000000,0,-1,11&cht=bvg&chbh=a"
	url$ + "&chco=7FFFFF|00007F|7F7F7F|7F0000|007F00|7F7F00|007F7F|7F007F|FF7F7F|7F7FFF"
	url$ + "|7FFF7F|0000FF|00FF00|FF0000&chdl="
	ForEach speedtest()
		chunksize = speedtest()\chunksize
		If chunksize < 1024
			url$ + Str(chunksize) + "B"
		ElseIf chunksize < 1048576
			url$ + Str(chunksize / 1024) + "KB"
		Else
			url$ + Str(chunksize / 1048576) + "MB"
		EndIf
		If ListIndex(speedtest()) <> (ListSize(speedtest()) - 1)
			url$ + "|"
		EndIf
	Next
	url$ + "&chd=t:"

	;Relative speed per chunk size: baseline time divided by this size's average time, output as a percentage.
	ForEach speedtest()
		sum = (speedtest()\rnd_buffered + speedtest()\rnd_unbuffered + speedtest()\seq_buffered + speedtest()\seq_unbuffered) / 4.0
		If sum = 0
			sum = 1.0
		EndIf
		sum = base / sum
		If sum > maxvalue
			maxvalue = sum
		EndIf
		text$ + #LF$ + "Chunksize = " + Str(speedtest()\chunksize) + ", speed = " + StrD(sum * 100.0, 2) + " %"
		url$ + Str((sum * 100))
		If ListIndex(speedtest()) <> (ListSize(speedtest()) - 1)
			url$ + ","
		EndIf
	Next
	
	result$ = text$
 	url$ + "&chds=99," + Str((maxvalue * 100) + 1)
	
	ProcedureReturn #True
EndProcedure

Define result.i

If OpenConsole() = #False
	;Failed to open the console.
	End
EndIf
ConsoleTitle(#Program_Title)

result = ReadspeedTest()

CloseConsole()
If result
	MessageRequester(#Program_Title, result$)
	RunProgram(URLEncoder(url$))
EndIf

End

PS! For those curious, 128 KB seems like a good middle ground. If you go too large you might start to notice the memory overhead, and if you go too small you will notice high CPU load. Once you have found a chunk size that you like (test various HDD, SSD and USB devices, and on multiple systems if possible), set FileBuffersSize(#PB_Default, chunksize) to the desired size and you should have optimal buffering for your program and systems.
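
For example, something like this at the top of your own program (a minimal sketch; the 131072 value and the file name are just made-up placeholders for whatever your own testing suggests):

Code: Select all

FileBuffersSize(#PB_Default, 131072)	;128 KB, assumed sweet spot from your own test runs; applies to files opened afterwards

If ReadFile(0, "example.dat")	;made-up file name; reads on this file are now buffered in 128 KB chunks behind the scenes
	While Not Eof(0)
		ReadString(0)	;or ReadData()/ReadByte() etc., the buffering is transparent
	Wend
	CloseFile(0)
EndIf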