If you reduce the monitor brightness from 100% to 70%, you can save up to 20% of the energy the monitor uses.
I use an 8-year-old Apple 30" monitor (see below). I measured its power draw at full brightness: 145 W. When I lower the brightness to 70%, consumption drops to 98 W, which is about 32% less. It goes down further to 73 W at 50% brightness, but that looks a bit dim. 70% brightness looks ideal to me, and would save me about 1.8 kWh a week!
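If you want to sanity-check those numbers or plug in your own measurements, here's a quick Python sketch. The 38 hours of screen-on time per week is my assumption to make the arithmetic come out to the figure above; substitute your own usage:

```python
# Back-of-the-envelope check of the brightness savings above.

full_brightness_w = 145   # measured draw at 100% brightness (watts)
dimmed_w = 98             # measured draw at 70% brightness (watts)
hours_per_week = 38       # assumed screen-on time per week -- adjust for you

saving_w = full_brightness_w - dimmed_w
percent_saved = saving_w / full_brightness_w * 100
kwh_per_week = saving_w * hours_per_week / 1000  # W * h -> kWh

print(f"{percent_saved:.0f}% less power, {kwh_per_week:.1f} kWh saved per week")
# -> 32% less power, 1.8 kWh saved per week
```

The same formula works for any two power readings, so you can compare 50% brightness or a different monitor the same way.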
As usual, it depends; you will need to measure it. The efficiency difference involved should be comparable to that between fluorescent and LED room lighting.
I imagine the reduction in consumption will be similar proportionally. However, since the total draw is lower to start with, the saving will be smaller in absolute terms.
Have a look at my answer here for more info. I have a few monitors with different back-light technologies, but they are different sizes, so it's not a fair comparison.
Reducing monitor brightness is a good idea, but less relevant with modern LED back-lighting than with older CCFL (or even CRT) technology. Still worth doing but the savings will be less.
Never use a screensaver! They are irrelevant today and just burn energy. If a screen is not in use, it should be off. On Windows, you can use a tool such as github.com/ukanth/monitores to turn off your screens when you lock the computer (Win+L). You can also use a service such as shutdownscanner.com to monitor how many computers have been left on (this is one of my projects).