A simpler example with the symbol 中 (\u4e2d), which is a single 2-byte code unit in UTF-16, also does not work. And I verified the earth-sign example with an external decoder; it is valid (I can't share the link due to chat rules).
It is not valid in UTF-16, nor in UTF-8, which is why I asked whether you are using the right encoding. I am not talking about the Unicode standard here; I am talking about the C++ standard.
The problem is that the C++ standard specifies that \u accepts only code points, not code units. You are spelling out a surrogate pair with \u escapes, which the C++ standard specifically forbids. Unless you show me how you are encoding the string (meaning you show me the code), there is not much I can tell you.
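A minimal sketch of the difference, assuming your earth sign is U+1F30D (the exact character is only a guess for illustration):

```cpp
#include <string>

int main() {
    // Correct: \U names the full code point. In a u"" (char16_t) literal
    // the compiler itself emits the surrogate pair D83C DF0D.
    std::u16string ok = u"\U0001F30D";

    // Ill-formed: \uD83C and \uDF0D name surrogate code points, which
    // the C++ standard forbids in universal-character-names ([lex.charset]).
    // std::u16string bad = u"\uD83C\uDF0D";  // does not compile

    // BMP character from your example: U+4E2D fits in one UTF-16 code unit,
    // so \u is fine here.
    std::u16string zhong = u"\u4E2D";
    return 0;
}
```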
By default, the values in a string are printed assuming the local machine's encoding. Unless you specify the encoding correctly, the printed values will not be meaningful. Your terminal must also support the encoding used. Unless all of this lines up, you will not see the output you expect.
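A quick way to test the terminal side in isolation is to write the raw UTF-8 bytes yourself (a sketch, assuming a UTF-8-capable terminal):

```cpp
#include <cstdio>

int main() {
    // The three UTF-8 code units of U+4E2D, written out verbatim.
    // This renders as 中 only if the terminal decodes stdout as UTF-8;
    // under another encoding (e.g. a legacy code page) you get mojibake.
    std::fputs("\xE4\xB8\xAD\n", stdout);
    return 0;
}
```

If this prints garbage, the problem is the terminal's encoding, not your C++ source.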