Monitors are generally becoming wider. Most monitors these days fall somewhere in the range from 1024x768 (old desktops, 4:3) through 1280x720 (older laptops and tablets, 16:9) to 1920x1080 (modern monitors, TVs, and laptops, 16:9).
When your application starts, check the resolution of your program's window (not the screen size, since some users may run the application in windowed mode) and calculate the width-to-height ratio.
Assuming you have a 1920x1080 (16:9) texture to work with and you detect a window resolution ratio of 1.77 ± 10%, create a 16:9 background image/texture.
If the window resolution ratio is 1.33 ± 10%, create a 4:3 background image using only the central portion of the texture.
Otherwise, keep checking against the other popular ratios (16:10, 5:4, and so on).
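The ratio matching above can be sketched in a few lines (a minimal sketch in Python; the ratio table and the `closest_ratio` helper name are my own, and the 10% tolerance is the one from this answer):

```python
# Popular width/height ratios to test against.
POPULAR_RATIOS = {
    "16:9": 16 / 9,    # ~1.78
    "16:10": 16 / 10,  # 1.60
    "4:3": 4 / 3,      # ~1.33
    "5:4": 5 / 4,      # 1.25
}

def closest_ratio(window_w, window_h, tolerance=0.10):
    """Return the name of the popular aspect ratio closest to the
    window's ratio, or None if nothing is within the tolerance."""
    r = window_w / window_h
    name, value = min(POPULAR_RATIOS.items(), key=lambda kv: abs(kv[1] - r))
    if abs(value - r) / value <= tolerance:
        return name
    return None
```

For example, `closest_ratio(1920, 1080)` returns `"16:9"` and `closest_ratio(1024, 768)` returns `"4:3"`.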
In other words, for a 4:3 window, use only the central 1440x1080 portion of your texture and create a sprite to match the width and height of the window. If you are using an orthographic projection, create a full-screen quad with corners at (-1,-1), (-1,1), (1,-1), and (1,1); in OpenGL, normalized device coordinates run from -1 to 1 on both axes.
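Computing that centered portion is just a comparison of the two ratios (a minimal sketch; `centered_crop` is a hypothetical helper name, and it returns the crop rectangle in texture pixels):

```python
def centered_crop(tex_w, tex_h, window_w, window_h):
    """Return (x, y, w, h): the centered sub-rectangle of a tex_w x tex_h
    texture whose aspect ratio matches the window's."""
    window_ratio = window_w / window_h
    if window_ratio < tex_w / tex_h:
        # Window is narrower than the texture: trim the left/right edges.
        crop_w = round(tex_h * window_ratio)
        return ((tex_w - crop_w) // 2, 0, crop_w, tex_h)
    else:
        # Window is wider than the texture: trim the top/bottom edges.
        crop_h = round(tex_w / window_ratio)
        return (0, (tex_h - crop_h) // 2, tex_w, crop_h)
```

For a 4:3 window and the 1920x1080 texture, `centered_crop(1920, 1080, 1024, 768)` returns `(240, 0, 1440, 1080)`, i.e. the central 1440x1080 portion mentioned above.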
Then just let the graphics card scale the image down to the size of the window it is drawing in. As long as you have the ratio roughly correct and use the right filtering parameters on the texture, the graphics card will do a pretty good job of scaling the image.
As you mentioned, if you are scaling a small image up, you will still have to clip some portion of it, either the top and bottom edges if the window has a much wider aspect ratio than the image, or the left and right edges if the window has a much narrower aspect ratio than the image.
This is the solution I've seen in a number of 2D games when I change the resolution/resize the window. Granted, this means that people with older monitors will not see the left and right edges of your beautiful backgrounds, but you can compensate by making those edges something unimportant, like some shrubbery, the continuation of a castle wall, or some mountains.
In regards to performance, I wouldn't take the drop from 4,000 fps to 2,000 fps too seriously. In my experience, frame rates above 600-800 fps behave very erratically.
A frame rate of 2,000 fps means one frame every 0.5 milliseconds! And a single frame involves hundreds of thousands to tens of millions of mathematical operations (a 1080p display alone has about 2 million pixels!).
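To put that drop in frame-time terms (simple arithmetic, not a benchmark), going from 4,000 fps to 2,000 fps only costs a quarter of a millisecond per frame:

```python
def frame_time_ms(fps):
    """Milliseconds spent on a single frame at the given frame rate."""
    return 1000.0 / fps

# The "drop" from 4,000 fps to 2,000 fps in absolute terms:
cost = frame_time_ms(2000) - frame_time_ms(4000)  # 0.5 - 0.25 = 0.25 ms
```

A quarter of a millisecond of extra work per frame is noise at those rates; at 60 fps you have a 16.7 ms budget per frame.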
Small changes in the type of data being processed, such as a switch from geometry-heavy environments to texture-heavy environments, can make huge changes in GPU memory access patterns, caching, parallelization, etc.
I've seen strange fps gains/losses in my own programs at high (1,000+) fps. However, once you start adding multiple textures, geometry with thousands of vertices, etc., these fluctuations tend to even out around 600-800 fps, and adding X percent more objects on screen starts translating into equivalent performance drops.
*Note: these observations are from my own PC and are not conclusive benchmarks or cited facts of any kind.