Kamon spanId is reused for remote calls with Zipkin


sjo...@crobox.com

Jul 4, 2018, 5:29:13 AM
to kamon-user
Using Kamon-core 1.1.2 and Kamon-zipkin 1.0.0.

I'm investigating an issue we are having with Kamon tracing and Zipkin.

Following the recommended documentation, we have set

kamon.trace.join-remote-parents-with-same-span-id = yes

in our configuration since we are using Zipkin.


But it seems that multiple servers are now annotating/reusing the same spanId, resulting in strange, unwanted behaviour and the following span in Zipkin.

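As a sketch of what we might try next (assuming the Kamon 1.x default for this key is `no`; not yet verified against our setup), reverting the setting should give each remote hop its own span id, at the cost of Zipkin no longer merging the client and server sides into one span:

```hocon
# application.conf sketch (assumption: Kamon 1.x defaults this key to "no").
kamon.trace {
  # With "no", each remote call creates a child span with a fresh span id,
  # so multiple servers stop annotating the same span in Zipkin.
  join-remote-parents-with-same-span-id = no
}
```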

Here is a JSON data snapshot of it:

```json
{
  "traceId": "67bbe400678e7434",
  "id": "36d7ccc1701692da",
  "name": "serviceactor: request",
  "parentId": "461c6abaafca62c2",
  "timestamp": 1530693333239282,
  "duration": 92,
  "annotations": [
    {
      "timestamp": 1530693333235463,
      "value": "akka.actor.dequeued",
      "endpoint": { "serviceName": "api", "ipv4": "172.16.15.2" }
    },
    {
      "timestamp": 1530693333237837,
      "value": "akka.actor.dequeued",
      "endpoint": { "serviceName": "api", "ipv4": "172.16.7.4" }
    },
    {
      "timestamp": 1530693333239317,
      "value": "akka.actor.dequeued",
      "endpoint": { "serviceName": "api", "ipv4": "172.16.16.6" }
    }
  ],
  "binaryAnnotations": [
    {
      "key": "akka.actor.class",
      "value": "my.ServiceActor",
      "endpoint": { "serviceName": "api", "ipv4": "172.16.15.2" }
    },
    {
      "key": "akka.actor.class",
      "value": "my.ShardingActor",
      "endpoint": { "serviceName": "api", "ipv4": "172.16.7.4" }
    },
    {
      "key": "akka.actor.class",
      "value": "my.ShardingActor",
      "endpoint": { "serviceName": "api", "ipv4": "172.16.16.6" }
    },
    {
      "key": "akka.actor.message-class",
      "value": "Request",
      "endpoint": { "serviceName": "api", "ipv4": "172.16.15.2" }
    },
    {
      "key": "akka.actor.message-class",
      "value": "Select",
      "endpoint": { "serviceName": "api", "ipv4": "172.16.7.4" }
    },
    {
      "key": "akka.actor.message-class",
      "value": "Receive",
      "endpoint": { "serviceName": "api", "ipv4": "172.16.16.6" }
    },
    {
      "key": "akka.actor.path",
      "value": "application/user/router",
      "endpoint": { "serviceName": "api", "ipv4": "172.16.15.2" }
    },
    {
      "key": "akka.actor.path",
      "value": "application/system/sharding/X/28/DsUCvnNOMOOYUiRQuQqcAw",
      "endpoint": { "serviceName": "api", "ipv4": "172.16.7.4" }
    },
    {
      "key": "akka.actor.path",
      "value": "application/system/sharding/X/35/2l8k0lT2MFmfLwfLf-mrMQ",
      "endpoint": { "serviceName": "api", "ipv4": "172.16.16.6" }
    },
    {
      "key": "akka.system",
      "value": "application",
      "endpoint": { "serviceName": "api", "ipv4": "172.16.15.2" }
    },
    {
      "key": "akka.system",
      "value": "application",
      "endpoint": { "serviceName": "api", "ipv4": "172.16.7.4" }
    },
    {
      "key": "akka.system",
      "value": "application",
      "endpoint": { "serviceName": "api", "ipv4": "172.16.16.6" }
    },
    {
      "key": "component",
      "value": "akka.actor",
      "endpoint": { "serviceName": "api", "ipv4": "172.16.15.2" }
    },
    {
      "key": "component",
      "value": "akka.actor",
      "endpoint": { "serviceName": "api", "ipv4": "172.16.7.4" }
    },
    {
      "key": "component",
      "value": "akka.actor",
      "endpoint": { "serviceName": "api", "ipv4": "172.16.16.6" }
    }
  ]
}
```

Ivan Topolnjak

Jul 4, 2018, 5:36:42 AM
to kamon-user
Hey there!

Noted. Let's track this issue via https://github.com/kamon-io/kamon-akka-remote/issues/15, thanks!