Hi,
I have a program that has been running on Windows 7, 8 and 10, and I just implemented an on-screen keyboard (OSK) feature for tablet use. It's integrated with one of my dialogs because the OSK that comes with Windows was too clumsy to use: I had to constantly keep moving it around in order to get to the fields I was trying to fill out.

My OSK is integrated with a dialog that pops up after clicking a button on a parent dialog (so the hierarchy is main window > dialog1 > dialog2 > OSK_Ctrl > button_label). Everything works OK on my Windows 7 development machine (with or without a 30" monitor) and on my Surface 3 (Windows 8), but NOT on my Surface 4 (Windows 10) or on a Windows 10 laptop. On the Surface 4 (and the Windows 10 laptop), depending on the position of my OSK dialog on the screen, I have to click about 60mm above the actual button in order for the button to be identified correctly.

I then moved the dialog to the top and left edges of the screen (to remove any possible offsets), but that didn't help. However, if I connect my Surface 4 to the 30" monitor, then all is OK, i.e. the button is identified correctly. Connecting my Windows 10 laptop to a monitor made no difference, though; the problem was still there.

To identify a button I basically compare the X and Y provided by the hook event with each button's coordinates, and if they match it's considered a hit. A button's coordinate is basically 'dialog1.Left + dialog2.Left + OSK_Ctrl.Left + button_label.Left', and similar for the Top property.
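In case it helps, here's a minimal sketch of that hit-test logic (Python standing in for my actual code; the class and field names, sizes and positions are just made up for illustration):

```python
# Sketch of the hit-test described above: each control stores Left/Top
# relative to its parent, so a button's absolute position is the sum of
# the offsets down the chain dialog1 > dialog2 > OSK_Ctrl > button_label.

class Ctrl:
    def __init__(self, left, top, width, height, parent=None):
        self.left, self.top = left, top
        self.width, self.height = width, height
        self.parent = parent

    def screen_pos(self):
        """Absolute position = own offset plus all ancestor offsets."""
        x, y = self.left, self.top
        p = self.parent
        while p is not None:
            x += p.left
            y += p.top
            p = p.parent
        return x, y

def hit_test(button, hook_x, hook_y):
    """True if the hook event's X/Y falls inside the button."""
    bx, by = button.screen_pos()
    return (bx <= hook_x < bx + button.width
            and by <= hook_y < by + button.height)

# Made-up layout: dialog1 at (100, 100), dialog2 at (20, 30) inside it,
# OSK_Ctrl at (5, 200) inside that, button at (40, 10), size 60x25.
dialog1  = Ctrl(100, 100, 400, 300)
dialog2  = Ctrl(20, 30, 350, 250, parent=dialog1)
osk_ctrl = Ctrl(5, 200, 340, 120, parent=dialog2)
button   = Ctrl(40, 10, 60, 25, parent=osk_ctrl)

print(button.screen_pos())          # (165, 340)
print(hit_test(button, 170, 350))   # True: inside the button
print(hit_test(button, 170, 290))   # False: the click is above the button
```

This is exactly what works on Windows 7/8 for me, so the question is why the hook's X/Y and the summed Left/Top properties stop agreeing on the Windows 10 machines.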

I then decided to take the hook out of the equation and just printed my dialog's position (the Form.Top and Form.Left properties), and I get a different result on my Surface 4 depending on whether it's connected to the 30" monitor or not.

I'm really at a loss as to what it could be and would really appreciate some help.


Thanks,
Mike