This is a decade-old discussion at this point, and one I've mostly avoided participating in because I've historically viewed the problem as intractable and didn't have much to contribute.
We want a syntax sugar construct that can interact with the variable binding scope; that visually feels like an identifier but also looks like an atom or a string; or that is only used in contexts where the identifier type is clear; or that has additional syntax to support both atoms and strings. There's just not a lot of room to play with a single colon, the second-smallest possible piece of punctuation, in a way that doesn't ultimately look close to a tuple, which is another commonly cited issue with these proposals. Working in a single double-quote can't work syntactically, because quotes are matched and would risk breaking existing programs by silently parsing ahead to the next string. Any syntax sugar addition has to be a syntax error today, in all possible expressions (typespecs, function heads, etc.), to ensure backwards compatibility of existing code. Sigil literals would have to reproduce the full, complicated parsing of the entire language to turn their strings into good AST for all edge cases, and the sigil-suffix-style disambiguation of atoms vs strings has never been very compelling to me. It's also unusual for sigil literals to modify variable scope, so I've never much cared for that direction. And this is just scratching the surface of a decade of discussion!
However, I do have one new idea I haven't seen before, so I thought I'd bring it up. The reply I linked above is in response to an interesting notion: introducing a new unary operator for this purpose. I never saw this idea get much play; the original post spitballs something like:
1. %{`foo, `bar}
2. %{~foo, ~bar}
3. %{$foo, $bar}
I realized while playing around in iex this morning:
- That we already have an existing unary operator that was not around for all of this historical discussion
- That is already a tricky power-user syntax sugar for manipulating variables in scope
- That emits a specific compile-time error today when used on literal atoms, strings, and charlists
I am referring, of course, to the capture operator. While & is generally used as a unary operator to capture a function, within a capture we allow numbered placeholders like &1 and &2 to inject positional-argument variables into the function's scope, e.g. &(&1 + &2). What if we allowed & to capture atoms, strings, and charlists as well, only within %{} map literals, to implement this feature? In a sense, "capturing" a variable from the environment for use in (or extraction from) a map context, as well as a function context. e.g.:
```
foo = 1
bar = 2
baz = 3
map = %{ &:foo, &"bar", &'baz' }
#=> %{:foo => 1, "bar" => 2, 'baz' => 3}
```
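For contrast, here is the capture operator's existing, documented behavior, which is the "power-user syntax sugar" I'm proposing to build on:

```
# Capturing an anonymous function; &1 and &2 become its arguments:
add = &(&1 + &2)
add.(1, 2)
#=> 3

# Capturing a named function by name and arity:
upcase = &String.upcase/1
upcase.("hello")
#=> "HELLO"
```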
Trying to use & in this way today yields:
```
** (CompileError) iex:3: invalid args for &, expected one of:
* &Mod.fun/arity to capture a remote function, such as &Enum.map/2
* &fun/arity to capture a local or imported function, such as &is_atom/1
* &some_code(&1, ...) containing at least one argument as &1, such as &List.flatten(&1)
Got: :foo
```
We could overload it further with a case for maps. I'm not sure complicating this already often-confusing operator is a good idea, but it feels like a natural home to me: it cleanly supports exactly the literal types you care about, it can already modify variable scope, it is already more of a power-user syntax sugar feature, and we are already committed to answering questions about it and guiding newcomers to its documentation. It even syntax-highlights well already.
I suspect this syntax could even be leveraged within lists to build Keywords, as well, with some more thought.
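A sketch of what that Keyword extension might look like (speculative syntax, not valid Elixir today; I'm assuming the captured atom doubles as both the key and the variable name, mirroring the map proposal):

```
host = "localhost"
port = 5432

opts = [&:host, &:port]
# would expand to [host: host, port: port]
#=> [host: "localhost", port: 5432]
```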
I could even see syntax for explicit struct support, either via the existing assertive access syntax (&.field) or even bare-word identifier access (&field) to encourage this particular usage. With both a struct extension and a Keyword extension, you could do something like:
```
[&.host, &.port] = URI.parse("postgresql://username:password@host:5432/database_name?schema=public")
#=> [host: "host", port: 5432]
```
There are still a lot of syntax edge-cases to think through here, though.
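One such edge case: nested captures are already a compile-time error today, so it's unclear what the new form should mean when it appears inside an existing function capture (hypothetical example):

```
# Today, nesting & inside & raises a CompileError about nested captures:
# Enum.map(list, &Enum.map(&1, &(&1 + 1)))

# So under this proposal, would the inner capture here be allowed, and
# which & does it belong to?
# Enum.map(users, &%{&:name})
```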