I know that people who speak languages other than American English will have some difficulties with this -- Unicode has been around long enough that there's little excuse for anything newer than, say, five years old not to support UTF-16 -- but my own experience has taught me that the only safe option for universal interoperability is ASCII.
Yes, it's more than 50 years old. Yes, it was (originally) intended to control printers. Yes, it lacks typographic subtlety. Yes, it barely works for English. Yes, it's a 7-bit code (128 possible values, of which one is forbidden -- NUL, or 0 -- and 32 more are reserved for control characters: the rest of the C0 range plus DEL). But those 128 values are the bedrock of nearly every other character mapping, on every device and operating system, so they display the same way and are properly interpreted everywhere.
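The "bedrock" claim is easy to demonstrate: a string containing only ASCII characters encodes to byte-for-byte identical output under the common 8-bit and Unicode encodings. A quick sketch (the particular encodings tested here are my choice, not an exhaustive list):

```python
# Pure-ASCII text produces identical bytes under several common
# encodings, which is why it survives any of them unmangled.
text = "Plain ASCII survives everywhere."
ascii_bytes = text.encode("ascii")

for enc in ("utf-8", "latin-1", "cp1252"):
    assert text.encode(enc) == ascii_bytes

# Every byte stays in the 7-bit range.
assert all(b < 128 for b in ascii_bytes)
print("identical bytes under all tested encodings")
```

The same is emphatically not true once a single curly quote or accented letter appears: each encoding then produces different bytes, and misinterpretation becomes possible.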
Do not use accented characters. Do not use "curly" quotes (stick to typewriter-style ' and "). Do not use true dashes (stick to hyphens). And, in filenames, avoid wildcard and other shell-special characters (*, ?, !).
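Those rules are mechanical enough to automate. Here is a minimal sketch of a normalizer and a filename check following the advice above; the replacement table and the set of shell-special characters are my reading of it, not any standard:

```python
# Map common "smart" punctuation to its plain-ASCII equivalent.
# This table is an assumption -- extend it as your documents require.
REPLACEMENTS = {
    "\u2018": "'", "\u2019": "'",   # curly single quotes
    "\u201c": '"', "\u201d": '"',   # curly double quotes
    "\u2013": "-", "\u2014": "-",   # en dash, em dash
}
_TABLE = str.maketrans(REPLACEMENTS)

def asciify(text: str) -> str:
    """Replace known smart punctuation with ASCII stand-ins."""
    return text.translate(_TABLE)

def is_safe_filename(name: str) -> bool:
    """ASCII-only, and none of the shell-special characters above."""
    return name.isascii() and not any(c in "*?!" for c in name)

print(asciify("\u201cHello\u201d \u2014 world"))   # "Hello" - world
assert is_safe_filename("notes-2024.txt")
assert not is_safe_filename("what?.txt")
```

Anything still non-ASCII after `asciify` (accented letters, for instance) needs a human decision, so the function deliberately leaves it alone rather than guessing.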
No superiority is implied or intended. This is an experiential commentary on the stupidity of much of the software that still runs the world. Recognizing that stupidity has saved me much grief.