It comes up every day of my life, and I am befuddled as to why it isn’t a slickly solved problem.
I need to see what people are talking about on their screen.
I want to talk with a person and then see their screen on my phone or iPad. Or, I want to talk with a person and have them see my iPhone/iPad screen on their computer or phone.
Why is it so crazily impossible to see what people are talking about on their phone, and far from trivial on their computers? Why is it impossible for me to show others how to do something, or to show them some information?
I know there are remote access packages out there, and that there are conferencing services. But the clunky, time-consuming, complicated setup precludes their use 103% of the time.
When somebody calls me about a problem, I want to discuss it with them and see what the situation is. I don’t want them to read me what is on their screen, and I certainly don’t want to have to use paper and pen to write down what they are reading to me.
I am tired of having people hold their iPhone cameras up to their computer monitor, with the darn thing going constantly in and out of focus.
Yes, I know there are hundreds of ways to do this, but that’s not a solution, that’s another problem.
This should be a core OS feature (desktop and mobile), and not one that works only between like-OS devices. Can’t they activate something on their computer, or can’t there be a QR code, or something, so we can work together? This should be an inherent feature of modern platforms, not an app ecosystem opportunity.