It looks like Windows 11 might make some useful changes for folks who are running multi-monitor setups, plus the OS could push forward with gesture controls on touchscreens, too.
As Windows Latest flags up, those who use more than one monitor can encounter frustrating situations with Windows 10 when their PC goes to sleep, because upon waking, some running apps (or tabs) can be shifted to a different place on the screen (or crammed together on one display).
That’s annoying, of course, but in the leaked build of Windows 11 that surfaced last week, Microsoft has added an option under Display settings – ‘Remember window locations based on monitor connection’ – which should resolve this bugbear.
A second option has been added in this area of the Settings app, which will automatically minimize all open windows when you disconnect a secondary monitor, with everything reverting to the primary display.
Windows Latest also spotted a document from a Microsoft program manager on GitHub about exploring the use of “additional touch-based gestures in Windows” to open up further touch functionality.
Specifically, this concerns three- or four-finger trackpad-style gestures on the touchscreen, which would be routed directly to the operating system – so Microsoft is asking devs whose apps use these gestures how that might impact their software and user experience.
The idea is that users who don’t want the OS to intercept these gestures could switch the option off, and when an app that uses these multi-finger swipes is running, the OS could pop up a dialog asking whether the app or Windows should own the gesture.
These are just ideas Microsoft is kicking around at the moment, though, and they might not even be planned for Windows 11, but rather some future version of the desktop OS.
Future incarnations of Windows will not only work to advance gestures in this way, but will also push forward with ‘user presence detection’, giving hardware the ability to sense whether or not the user is in front of it (and to log on or lock accordingly). We’ve already heard about Windows 11 possibly coming with a ‘Wake on Touch’ feature, of course.