Well, it's a string with a uniqueness guarantee (two symbols with identical text are guaranteed to be the same object, at the same address), which makes equality checks super fast (just compare pointers). But pretty much just a string nonetheless...
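For concreteness, here's a minimal Common Lisp sketch of that difference (any Lisp with interned symbols behaves similarly):

```lisp
;; Interned symbols are unique objects, so EQ (a pointer comparison)
;; suffices; equal strings may be distinct objects.
(eq 'foo 'foo)        ; => T, both reads intern the same symbol
(eq "foo" "foo")      ; typically NIL: two string objects
                      ; (literal coalescing is unspecified)
(string= "foo" "foo") ; => T, but only after comparing contents
```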
Lisp symbols have various properties; the exact set depends on the dialect. The properties may be mutable (e.g. property list, value cell).
You can certainly associate such things with an integer: e.g. value_cell[42], plist[42].
But those associations are part of the "symbolness", not just the 42.
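In Common Lisp terms, a small sketch of those mutable cells (`temperature` is just an illustrative name):

```lisp
;; The mutable machinery hanging off a symbol: value cell and plist.
(setf (symbol-value 'temperature) 37)     ; value cell
(setf (get 'temperature :unit) :celsius)  ; property-list entry

(symbol-value 'temperature)  ; => 37
(get 'temperature :unit)     ; => :CELSIUS
(symbol-plist 'temperature)  ; => (:UNIT :CELSIUS)
```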
Integers are not suitable symbols in some ways: they have a security problem.
Real symbols can be used for sandboxing: if you don't already hold the symbol, you have no way to get it.
This can't work for integers, especially not reasonably small ones.
What do I mean that if you don't already have a symbol, you have no way to get it? If the symbol is interned, you can intern it, right?
Not if there is a symbol package system, and you don't have access to the package in which the symbol is interned.
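A hedged Common Lisp sketch of that capability idea; the package name SANDBOX-KEYS is hypothetical:

```lisp
;; A symbol held directly stays usable even when the package that
;; interned it is no longer reachable by name.
(defpackage :sandbox-keys (:use))
(defvar *key* (intern "MASTER" :sandbox-keys))  ; the capability we hold

(eq *key* (make-symbol "MASTER"))  ; => NIL, a fresh symbol is a
                                   ;    different object

(delete-package :sandbox-keys)
;; (intern "MASTER" :sandbox-keys) now signals an error, yet *KEY*
;; remains a valid, unforgeable token for code that already holds it.
```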
Now you might say: since objects are typically represented as pointers, aren't those a kind of integer anyway? Yes, they are, but they are a different type which doesn't support arithmetic; we can restrict programs from calculating arbitrary pointers, but we can't restrict programs from calculating arbitrary integers.
Even if we have escape hatches for converting an integer to a pointer, or other security/safety bypassing mechanisms, those escape hatches have an API which is identified by symbols, which we can exclude from the sandbox.
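A tiny Common Lisp illustration of that asymmetry (`secret-key` is a hypothetical name):

```lisp
;; Integers are forgeable by plain arithmetic; symbols are not.
(= 42 (+ 40 2))       ; => T, anyone can compute 42
;; (+ 'secret-key 1)  ; type error: symbols don't support arithmetic,
;; and no standard operation turns a computed integer into a symbol
;; (or pointer) you don't already hold.
```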
I (and the GP, given their use of the capitalised Symbol) was talking about Ruby symbols, which for the most part are closer to their object_id than to their textual representation.
What I mean by "integer" is not "pointer" but "natural number", in the sense that there exists only one `1`, whereas you can have multiple textual "foo" strings.
So it's more useful to think of symbols as natural numbers whose actual value you neither know nor care about: as a human you only care about the label you've attached to them, but you do care about their number-like property of conceptually existing only once.
Strings can be interned in Ruby, but it's always possible for an equal string that is not interned to also exist. Hence interned strings can't benefit from the same optimizations as symbols.
You can compare them by pointer first, but on a mismatch you have to fall back to comparing the contents. The same goes for the hash code: you have to hash the contents.
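Sketched in Common Lisp for concreteness (Ruby's interned strings face the same fallback); `interned-string=` is a hypothetical helper:

```lisp
;; Identity check first; content comparison only on a pointer mismatch.
(defun interned-string= (a b)
  (or (eq a b)         ; same object: one pointer comparison
      (string= a b)))  ; distinct objects: must walk the characters
```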
I'm talking about this as a user of the language, not as a language designer. I hold an unpopular opinion here: symbols are error-prone, and these optimizations are ultimately not worth it.