UTF-16 to UTF-8 conversion in JavaScript
Problem description
I have Base64-encoded data that is in UTF-16, and I am trying to decode it, but most libraries only support UTF-8. I believe I have to drop the null bytes, but I am unsure how.
Currently I am using David Chambers' Base64 polyfill, but I have also tried other libraries such as the ones from phpjs.org, none of which support UTF-16.
One thing to point out is that on Chrome the atob method works without a problem, on Firefox I get the results described here, and in IE I am only returned the first character.
Any help is greatly appreciated.
You want to decode UTF-16, not convert to UTF-8. Decoding means that the result is a string of abstract characters. Of course there is an internal encoding for strings as well, UTF-16 or UCS-2 in JavaScript, but that's an implementation detail.
With strings, the goal is that you don't have to worry about encodings but can just manipulate characters "as they are". So you can write string methods that don't need to decode their input at all. Of course, there are many edge cases where this falls apart.
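As a quick illustration of that implementation detail and one of those edge cases (this snippet is not part of the original answer): characters outside the Basic Multilingual Plane occupy two UTF-16 code units, so length and charCodeAt report code units rather than characters.

// Illustration: JavaScript strings are exposed as UTF-16 code units,
// so an astral character such as U+1F600 takes two positions.
var s = "\uD83D\uDE00";               // 😀 written as its surrogate pair
s.length;                             // 2 -- code units, not characters
s.charCodeAt(0).toString(16);         // "d83d" (high surrogate)
s.charCodeAt(1).toString(16);         // "de00" (low surrogate)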
You cannot decode UTF-16 just by removing nulls. That will work for the first 256 code points of Unicode, but you will get garbage whenever any of the other ~110,000 Unicode characters is used. You cannot even get the most popular non-ASCII characters, like the em dash or smart quotes, working.
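To see why stripping nulls fails outside Latin-1 (an illustration added here, not part of the original answer): an em dash, U+2014, is encoded in UTF-16LE as the byte pair 0x14 0x20, neither of which is null.

// The em dash U+2014 has no null byte in its UTF-16LE form, so there is
// nothing to strip; the two bytes must be combined into one code unit instead.
var emDashBytes = "\x14\x20";                 // UTF-16LE bytes of "—" as a binary string
emDashBytes.replace(/\x00/g, "");             // "\u0014 " -- unchanged, still garbage
String.fromCharCode(0x14 | (0x20 << 8));      // "—" (U+2014)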
Also, looking at your example, it looks like UTF-16LE.
//Braindead decoder that assumes fully valid input
function decodeUTF16LE(binaryStr) {
    var cp = [];
    for (var i = 0; i < binaryStr.length; i += 2) {
        cp.push(
            binaryStr.charCodeAt(i) |
            (binaryStr.charCodeAt(i + 1) << 8)
        );
    }
    return String.fromCharCode.apply(String, cp);
}
var base64decode = atob; //In Chrome and Firefox, atob is a native method available for Base64 decoding
var base64 = "VABlAHMAdABpAG4AZwA";
var binaryStr = base64decode(base64);
var result = decodeUTF16LE(binaryStr);
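//"Testing"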
Now you can even get smart quotes working:
var base64 = "HCBoAGUAbABsAG8AHSA="
var binaryStr = base64decode(base64);
var result = decodeUTF16LE(binaryStr);
//""hello""