this post was submitted on 23 Aug 2023
74 points (95.1% liked)

The title is probably confusing, but I could not word it any better. I noticed that most programming languages are limited to the alphanumeric set along with the special characters present on a typical keyboard. I wondered whether this posed a barrier for developers by limiting the characters they could program with, or whether these keys were intentionally chosen from the start as the most practical characters for a human to write programs with, and were only later adopted as a standard for every user. Basically, are modern keyboards built around programming languages, or are programming languages built around these keyboards?

[–] [email protected] 2 points 1 year ago (2 children)
[–] [email protected] 4 points 1 year ago* (last edited 1 year ago)

He's not completely right, though. @marcos had it right about co-evolution -- leaving aside any issues of internationalization, the layout of the letters came from typewriters, but the punctuation available, and where it sat, differed from computer to computer for much of the early history of programming. Some of the more extreme examples were the Space Cadet Keyboard used at MIT, and APL, which more or less required an APL-specific keyboard just to access all the special symbols the language is built from. Here's an APL program:

⎕CR 'ProveNonPrime'
Z←ProveNonPrime R
⍝Show all factors of an integer R - except 1 and the number itself,
⍝ i.e., prove Non-Prime. String 'prime' is returned for a Prime integer.
Z←(0=(⍳R)|R)/⍳R  ⍝ Determine all factors for integer R, store into Z
Z←(~(Z∊1,R))/Z   ⍝ Delete 1 and the number as factors for the number from Z.
→(0=⍴Z)/ProveNonPrimeIsPrime               ⍝ If result has zero shape, it has no other factors and is therefore prime
Z←R,(⊂" factors(except 1) "),(⊂Z),⎕TCNL  ⍝ Show the number R, its factors(except 1,itself), and a new line char
→0  ⍝ Done with function if non-prime
ProveNonPrimeIsPrime: Z←R,(⊂" prime"),⎕TCNL  ⍝ function branches here if number was prime
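
For anyone who can't read APL, here's roughly what that does, sketched in Python. This is a loose translation for illustration only -- the function name and output format are my own approximation, not part of the original:

# Rough Python equivalent of the APL ProveNonPrime above (illustrative sketch only)
def prove_non_prime(r):
    # All factors of r except 1 and r itself -- the APL builds this with (0=(⍳R)|R)/⍳R, then drops 1 and R
    factors = [n for n in range(2, r) if r % n == 0]
    if not factors:
        # Nothing left means r has no divisors other than 1 and itself, so it's prime
        return f"{r} prime"
    return f"{r} factors (except 1) {factors}"

print(prove_non_prime(12))  # 12 factors (except 1) [2, 3, 4, 6]
print(prove_non_prime(13))  # 13 prime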

Things became much more standardized once the IBM PC's keyboard became the norm, and were formalized in 1995 with ISO/IEC 9995. Once the layout stabilized, both language designers and keyboard makers had a strong incentive to stick with what everyone was used to, so each could keep working with the other. But it wasn't always that way.

Edit: Here's what things looked like on an IBM 3276:

[image: IBM 3276 keyboard layout]

[–] [email protected] 0 points 1 year ago* (last edited 1 year ago) (2 children)

Fun fact: the standard QWERTY layout was designed to slow typewriter typing down by putting common letters off the home row and away from each other. This was done to keep the little key arm thingies (the typebars) from colliding and jamming during fast typing.

EDIT: Apparently this is not a fact

[–] [email protected] 8 points 1 year ago (1 children)

The point wasn't to slow down typists, but to reduce the number of bigrams (two-letter sequences) that would be typed with adjacent keys, since that's the specific movement that's most likely to cause the key levers to jam.

[–] [email protected] 1 points 1 year ago

Not true. The current layout is the result of years of evolution based on feedback from typists and vendors.