CVE-2023-29374

In LangChain through 0.0.131, the LLMMathChain chain allows prompt injection attacks that can execute arbitrary code via the Python exec method.
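The core issue can be sketched as follows. This is a hypothetical minimal reproduction of the vulnerable pattern, not LangChain's actual source: the chain trusts whatever "math code" the LLM returns and hands it directly to Python's exec, so a prompt that injects non-math instructions gets its code executed on the host.

```python
def run_llm_math(llm_output: str) -> dict:
    """Evaluate 'math' code produced by an LLM -- unsafely.

    Hypothetical stand-in for the vulnerable chain: the LLM's
    output is executed verbatim with exec(), so any injected
    payload runs with the application's privileges.
    """
    scope: dict = {}
    exec(llm_output, {}, scope)  # attacker-controlled code runs here
    return scope

# Benign case: the model actually returns arithmetic.
print(run_llm_math("result = 2 + 2")["result"])

# Prompt-injected case: the "answer" is arbitrary Python. Here it
# merely reads the working directory, but it could spawn a shell.
injected = "import os\nresult = os.getcwd()"
print(run_llm_math(injected)["result"])
```

The fix direction implied by the CWE is to never pass model output to a general-purpose interpreter; restrict evaluation to a dedicated expression evaluator that cannot import modules or call arbitrary functions.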
Configurations

Configuration 1

cpe:2.3:a:langchain:langchain:*:*:*:*:*:*:*:*

History

No history.

Information

Published : 2023-04-05 02:15

Updated : 2024-02-28 20:13


NVD link : CVE-2023-29374

Mitre link : CVE-2023-29374

CVE.ORG link : CVE-2023-29374

Products Affected

langchain

  • langchain
CWE
CWE-74

Improper Neutralization of Special Elements in Output Used by a Downstream Component ('Injection')