
Windows 11 could have some love for touchscreens and multi-monitor setups

[Image: Surface Pro 7 — credit: Future]

It looks like Windows 11 might make some useful changes for folks who are running multi-monitor setups, plus the OS could push forward with gesture controls on touchscreens, too.

As Windows Latest flags up, those who use more than one monitor can run into a frustrating situation in Windows 10 when the PC goes to sleep: upon waking, some running apps (or tabs) can end up shifted to a different position on screen, or crammed together on a single display.

That’s annoying, of course, but in the leaked Windows 11 build that surfaced last week, Microsoft has added an option under Display settings to ‘Remember window locations based on monitor connection’, which should resolve this bugbear.

A second option has been added in this area of the Settings app, which automatically minimizes all open windows when you disconnect a secondary monitor, with everything reverting to the primary display.

Goodwill gestures

Windows Latest also spotted a document from a Microsoft program manager on GitHub about exploring the use of “additional touch-based gestures in Windows” to open up further touch functionality.

Specifically, this concerns three- or four-finger trackpad-style gestures on the touchscreen, which would be routed directly to the operating system – so Microsoft is asking devs whose apps use these gestures how that might impact their software and user experience.

Note that the idea is that users who don’t want the OS to intercept these gestures could switch the option off. Furthermore, when an app that uses these multi-fingered swipes is running, the OS could pop up a dialog asking what the user wishes to do: let the app own the gesture, or hand it to Windows.

These are just ideas Microsoft is kicking around at the moment, though, and they might not even be planned for Windows 11 – they could instead land in some future version of the desktop OS.

Future incarnations of Windows will not only advance gestures in this way, but will also push forward with ‘user presence detection’ – giving hardware the ability to sense whether or not the user is in front of it, and to log on or lock accordingly. We’ve already heard about Windows 11 possibly coming with a ‘Wake on Touch’ feature, of course.

Darren is a freelancer writing news and features for TechRadar (and occasionally T3) across a broad range of computing topics including CPUs, GPUs, various other hardware, VPNs, antivirus and more. He has written about tech for the best part of three decades, and writes books in his spare time (his debut novel - 'I Know What You Did Last Supper' - was published by Hachette UK in 2013).