How to send a JSON payload to RabbitMQ using the web plugin?


Problem Description


I have a RabbitMQ 3.4.2 instance with a web management plugin installed.

When I push the message {'operationId': 194} to the queue using Python's kombu package, the message is read on the other end as a dictionary.
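Under the hood, kombu serializes a dict body to JSON text and stamps the message with a `content_type` of `application/json`, which is why the consumer gets a dict back. A minimal stdlib sketch of that round trip (illustrative only, no broker involved):

```python
import json

# What kombu does when publishing a dict: serialize the body to JSON
# text and record the content type on the message properties.
message = {"operationId": 194}
body = json.dumps(message)
content_type = "application/json"

# On the consuming side, kombu sees content_type and deserializes
# the text back into a dict before handing it to the callback.
payload = json.loads(body)
print(payload["operationId"])  # 194
```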

However, when I send the message using the web console, I get the following error on the receiving end:

operation_id = payload['operationId']
TypeError: string indices must be integers
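This traceback matches what happens when the JSON arrives as plain text: indexing a `str` with a string key raises exactly this `TypeError`. A quick illustration:

```python
# An untagged message is delivered as raw text, so the consumer
# callback receives a str, not a dict.
payload = '{"operationId": 194}'

try:
    payload["operationId"]  # str indexing requires an integer
except TypeError as exc:
    print(exc)  # matches the error in the question
```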

I have tried adding a content-type header and property, with no success.

Since the reader code is the same, it means that the web sender does not mark the sent message as a JSON / dictionary payload, and therefore it is read as a string on the other end.
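One defensive workaround on the consuming side (a hypothetical helper, not part of the original reader code) is to decode string bodies yourself, so the consumer works whether or not the sender tagged the message:

```python
import json

def decode_payload(body):
    """Return a dict whether the message was properly tagged
    (body arrives as a dict) or sent untagged from the web
    console (body arrives as raw JSON text)."""
    if isinstance(body, (bytes, str)):
        return json.loads(body)
    return body

print(decode_payload('{"operationId": 194}')["operationId"])  # 194
print(decode_payload({"operationId": 194})["operationId"])    # 194
```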

Any idea how to mark a message as a JSON message using the RabbitMQ web console?

Solution

I had to use content_type instead of content-type (an underscore instead of a hyphen).

This is a pretty questionable design decision, because the standard everybody knows is content-type.
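The web console's "Publish message" form feeds the management HTTP API, where the property key is likewise `content_type`. A sketch of the request body that endpoint expects (the queue name `tasks` is hypothetical):

```python
import json

# Request body for POST /api/exchanges/%2f/amq.default/publish on
# the management API (the endpoint behind the web console's
# "Publish message" form). Note content_type with an underscore.
publish_request = {
    "routing_key": "tasks",  # hypothetical queue name
    "payload": json.dumps({"operationId": 194}),
    "payload_encoding": "string",
    "properties": {"content_type": "application/json"},
}
print(publish_request["properties"]["content_type"])  # application/json
```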

