Gaussian blur in real time?

Advanced game related topics
Joubarbe
Enthusiast
Posts: 555
Joined: Wed Sep 18, 2013 11:54 am
Location: France

Re: Gaussian blur in real time?

Post by Joubarbe »

The structure of most of my 2D programs:

Code: Select all

EnableExplicit

#screen_width = 1920
#screen_height = 1080
#app_title = "TITLE"

Define sprite_mouse.i, sprite_fps.i
Define frame.i, frame_refresh.i, frame_time.i, event.i
Global delta.d

InitKeyboard() : InitMouse() : InitSprite()

OpenWindow(0, 0, 0, #screen_width, #screen_height, #app_title, #PB_Window_BorderLess | #PB_Window_ScreenCentered)
OpenWindowedScreen(WindowID(0), 0, 0, #screen_width, #screen_height, #False, 0, 0, #PB_Screen_SmartSynchronization)

sprite_mouse = CreateSprite(#PB_Any, 7, 7)
StartDrawing(SpriteOutput(sprite_mouse))
Box(0, 0, OutputWidth(), OutputHeight(), #Black)
Box(0, 0, OutputWidth() - 2, OutputHeight() - 2, #White)
StopDrawing()

sprite_fps = CreateSprite(#PB_Any, 40, 20)

Repeat
  Repeat
    event.i = WindowEvent()
    If event = #PB_Event_CloseWindow : Break 2 : EndIf
  Until event = 0
  
  ClearScreen(RGB(50, 65, 70))
  
  ExamineKeyboard() : ExamineMouse()
  MouseLocate(DesktopMouseX(), DesktopMouseY())
  
  StartDrawing(ScreenOutput())
  DrawingBuffer() ; PERF
  StopDrawing()
  
  DisplaySprite(sprite_fps, 0, 0)
  DisplaySprite(sprite_mouse, MouseX(), MouseY())
  
  ; FPS and Delta Time
  frame + 1
  If ElapsedMilliseconds() > frame_refresh
    StartDrawing(SpriteOutput(sprite_fps))
    DrawText(0, 0, "fps:" + frame)
    StopDrawing()
    frame_refresh = ElapsedMilliseconds() + 1000
    frame = 0
  EndIf
  delta = (ElapsedMilliseconds() - frame_time) / 1000.0 ; 1000.0, not 1000: integer division would truncate delta to 0
  frame_time = ElapsedMilliseconds()
  
  FlipBuffers()   
Until KeyboardReleased(#PB_Key_Escape)
40 FPS (without the Debugger).
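As an aside, the delta value in the loop above is computed but never consumed; the usual pattern is to scale per-frame movement by it so speeds stay constant regardless of frame rate. A minimal sketch (the names `x` and `speed_px_per_sec` are illustrative, not from the original code):

```purebasic
; Sketch: frame-rate independent movement using the delta from the loop above.
Define x.d = 0.0
Define speed_px_per_sec.d = 200.0

; Inside the frame loop, after delta has been updated:
x + speed_px_per_sec * delta ; advances ~200 px per second whether the loop runs at 30 or 60 FPS
```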

EDIT: OK, with DirectX11 and without the Debugger there's no cost, a steady 60 FPS. On OpenGL I drop to 30, and the DirectionalBlur module gives me black and white pixels. Weird.
wilbert
PureBasic Expert
Posts: 3870
Joined: Sun Aug 08, 2004 5:21 am
Location: Netherlands

Re: Gaussian blur in real time?

Post by wilbert »

Joubarbe wrote:EDIT: Ok, with DirectX11 without Debugger, there's no cost, steady 60FPS. On OpenGL, I drop at 30 and the DirectionalBlur module gives me black and white pixels. Weird.
It's explained in the PB help page for ScreenOutput().
OpenGL doesn't allow direct buffer access.
So when using OpenGL, the whole screen buffer is copied to main memory, and copied back again when StopDrawing() is called.
That round trip is what makes it slow on OpenGL.
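To sketch one way around that cost (assuming the frame loop from the first post; `Blur()` is a placeholder for whatever blur routine is in use, not a real PB command): instead of touching the screen buffer every frame, blur once into a sprite and just redisplay that sprite.

```purebasic
; Slow on OpenGL: each StartDrawing(ScreenOutput())/StopDrawing() pair
; round-trips the whole screen buffer through main memory, every frame.
StartDrawing(ScreenOutput())
Blur(DrawingBuffer(), DrawingBufferPitch()) ; Blur() is hypothetical
StopDrawing()

; Cheaper: pay the copy once, blur into a sprite, then redisplay it.
Define sprite_blurred = CreateSprite(#PB_Any, #screen_width, #screen_height)
StartDrawing(SpriteOutput(sprite_blurred))
Blur(DrawingBuffer(), DrawingBufferPitch()) ; Blur() is hypothetical
StopDrawing()

; ... then, in the frame loop, only this runs per frame:
DisplaySprite(sprite_blurred, 0, 0)
```

This only helps when the blurred content doesn't change every frame, of course; a blur that must track live screen contents still has to read the buffer back.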
Windows (x64)
Raspberry Pi OS (Arm64)
#NULL
Addict
Posts: 1440
Joined: Thu Aug 30, 2007 11:54 pm
Location: right here

Re: Gaussian blur in real time?

Post by #NULL »

Joubarbe wrote:But there must be something I don't get, because this obviously doesn't work. Nothing gets blurred, even though the CPU is working.
You would have to either copy the 2nd buffer back to the drawing buffer, or pass the drawing buffer address directly to the blur function. At the moment it looks like you are just blurring the copied memory without doing anything with the result.
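A minimal sketch of the first option (blur a copy, then write it back), assuming a `Blur()` procedure that operates on a raw pixel buffer; `Blur()` and the variable names are illustrative, not from the original code:

```purebasic
; Sketch: blur a copy of the drawing buffer, then copy the result back
; so the blurred pixels actually reach the screen.
StartDrawing(ScreenOutput())
Define *buffer = DrawingBuffer()
Define pitch   = DrawingBufferPitch()
Define size    = pitch * OutputHeight()
Define *copy   = AllocateMemory(size)
CopyMemory(*buffer, *copy, size)  ; snapshot the frame
Blur(*copy, pitch)                ; blur the copy (hypothetical routine)
CopyMemory(*copy, *buffer, size)  ; write the blurred pixels back
FreeMemory(*copy)
StopDrawing()
```

Passing `DrawingBuffer()` straight to the blur function avoids the two extra CopyMemory() calls, but only if the blur can work in place.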