Converting customized XTEA algorithm from JavaScript to Golang
I have converted a customized XTEA encryption routine from JavaScript to Go, but the Go output is incorrect and does not match the JavaScript output. Here is my JavaScript source code:
```javascript
function sample(e, t) {
    for (var n = 32, r = 0; 0 < n--; ) {
        e[0] += (((e[1] << 4) ^ (e[1] >> 5)) + e[1]) ^ (r + t[3 & r]);
        r += -1640531527;
        e[1] += (((e[0] << 4) ^ (e[0] >> 5)) + e[0]) ^ (r + t[(r >> 11) & 3]);
    }
}

var temp = [15, 16];
var temp_2 = [14, 15, 16, 17];
sample(temp, temp_2);
console.log(temp);
```
and here is my Go source code:
```go
func sample(v *[2]uint32, key *[4]uint32) {
    const (
        num_rounds uint32 = 32
        delta      uint32 = 0x9E3779B9
    )
    for i, sum := uint32(0), uint32(0); i < num_rounds; i++ {
        v[0] += (((v[1] << 4) ^ (v[1] >> 5)) + v[1]) ^ (sum + key[sum&3])
        sum += delta
        v[1] += (((v[0] << 4) ^ (v[0] >> 5)) + v[0]) ^ (sum + key[(sum>>11)&3])
    }
}
```
I think the problem is related to the golden-ratio constant and to conversion from JavaScript's 64-bit floating-point numbers, which I haven't handled because I don't know exactly how to do it.
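A quick way to see the mismatch (my own illustrative sketch, not part of the original question): JavaScript's bitwise operators truncate their operands to a signed 32-bit integer, and `>>` is an arithmetic (sign-extending) shift, whereas the `uint32` port shifts logically. For example:

```go
package main

import "fmt"

func main() {
	// A value that is "negative" when reinterpreted as a signed
	// 32-bit integer, as happens to e[1] mid-way through the cipher.
	x := int64(3000000000)

	// JavaScript's >> first truncates to int32, then performs an
	// arithmetic (sign-extending) shift:
	fmt.Println(int32(x) >> 5) // -40467728

	// The uint32 port instead performs a logical shift:
	fmt.Println(uint32(x) >> 5) // 93750000
}
```

So whenever `v[1]` holds a value whose high bit is set, `v[1] >> 5` in the `uint32` version disagrees with `e[1] >> 5` in JavaScript, and the two ciphers diverge.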
Answer 1 (score: 1)
Here is the Go implementation:
```go
package main

import (
	"fmt"
)

func main() {
	v := [2]int64{15, 16}
	key := [4]int64{14, 15, 16, 17}
	sample(&v, &key)
}

func sample(v *[2]int64, key *[4]int64) {
	const (
		num_rounds       = 32
		delta      int64 = 1640531527
	)
	for i, sum := 0, int64(0); i < num_rounds; i++ {
		temp := int32(v[1])
		v[0] += int64((((temp << 4) ^ (temp >> 5)) + temp) ^ int32(sum+key[int32(sum)&3]))
		sum -= delta
		temp = int32(v[0])
		v[1] += int64((((temp << 4) ^ (temp >> 5)) + temp) ^ int32(sum+key[(int32(sum)>>11)&3]))
	}
	fmt.Println(*v)
	// Output: [6092213800 11162584543]
}
```
Explanation

The safe range of a JavaScript integer is between -(2^53 - 1) and 2^53 - 1 (see Integer range for Number). The tricky part of the JavaScript implementation is that bitwise operators always convert their operands to 32-bit integers (see Fixed-width number conversion).

To align with the JavaScript implementation, the data type should be `int64` (`int32` or `uint32` does not have enough room for numbers between -(2^53 - 1) and 2^53 - 1). So these variables should be declared as `int64`:

- the items in `v`
- the items in `key`
- `sum`
- `delta`

Then, before we perform the bitwise operations, we convert every operand to `int32`.
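To illustrate that truncation step (an illustration of my own, with arbitrary example values): converting an `int64` variable to `int32` in Go keeps the low 32 bits and reinterprets them as signed, which matches the ToInt32 conversion JavaScript applies inside its bitwise operators:

```go
package main

import "fmt"

func main() {
	// Conversions of variables (not constants) wrap modulo 2^32.
	a := int64(4294967301) // 2^32 + 5
	b := int64(2147483648) // 2^31
	fmt.Println(int32(a)) // 5, like JS (4294967301 | 0)
	fmt.Println(int32(b)) // -2147483648, like JS (2147483648 | 0)
}
```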