Java object deserialization issue in Jackrabbit webapp/standalone on all platforms allows an attacker to remotely execute code via RMI. Versions up to and including 2.20.10 (stable branch) and 2.21.17 (unstable branch) use the component "commons-beanutils", which contains a class that can be used for remote code execution over RMI.
Users are advised to update immediately to version 2.20.11 or 2.21.18. Note that earlier stable branches (1.0.x .. 2.18.x) have already reached end of life and no longer receive updates.
In general, RMI support can expose vulnerabilities by the mere presence of an exploitable class on the classpath. Even if Jackrabbit itself does not contain any code known to be exploitable anymore, adding other components to your server can expose the same type of problem. We therefore recommend disabling RMI access altogether (see further below), and will discuss deprecating RMI support in future Jackrabbit releases.
How to check whether RMI support is enabled
RMI support can be provided over an RMI-specific TCP port and over an HTTP binding. Both are enabled by default in Jackrabbit webapp/standalone.
The native RMI protocol uses port 1099 by default. Tools like "netstat" can be used to check whether it is listening.
RMI-over-HTTP in Jackrabbit uses the path "/rmi" by default. So when running standalone on port 8080, check whether an HTTP GET request on localhost:8080/rmi returns 404 (not enabled) or 200 (enabled). Note that the HTTP path may be different when the webapp is deployed in a container as a non-root context, in which case the prefix is under the user's control.
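Both checks can also be scripted. The following minimal sketch assumes the Jackrabbit defaults described above (port 1099 for native RMI, a root-context deployment on port 8080 for the HTTP binding); adjust host, port and context path to your deployment:

import java.net.HttpURLConnection;
import java.net.Socket;
import java.net.URL;

public class RmiExposureCheck {
    public static void main(String[] args) throws Exception {
        // Native RMI: a successful TCP connect to the default registry port
        // suggests RMI is listening. Host and port are assumptions based on
        // the Jackrabbit defaults.
        try (Socket ignored = new Socket("localhost", 1099)) {
            System.out.println("Port 1099 open - native RMI appears enabled");
        } catch (Exception e) {
            System.out.println("Port 1099 closed - native RMI appears disabled");
        }

        // RMI-over-HTTP: 200 means the RemoteBindingServlet is mapped,
        // 404 means it is not. The path prefix differs for non-root contexts.
        HttpURLConnection conn =
                (HttpURLConnection) new URL("http://localhost:8080/rmi").openConnection();
        conn.setRequestMethod("GET");
        int status = conn.getResponseCode();
        System.out.println("GET /rmi returned " + status
                + (status == 200 ? " (enabled)" : " (not enabled)"));
        conn.disconnect();
    }
}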
Turning off RMI
Find web.xml (either in the JAR/WAR file or in the unpacked web application folder) and remove the declaration and the mapping definition for the RemoteBindingServlet:
<servlet>
  <servlet-name>RMI</servlet-name>
  <servlet-class>org.apache.jackrabbit.servlet.remote.RemoteBindingServlet</servlet-class>
</servlet>
<servlet-mapping>
  <servlet-name>RMI</servlet-name>
  <url-pattern>/rmi</url-pattern>
</servlet-mapping>
Find the bootstrap.properties file (in $REPOSITORY_HOME), and set
rmi.enabled=false
and also remove
rmi.host
rmi.port
rmi.url-pattern
If there is no file named bootstrap.properties in $REPOSITORY_HOME, it is located somewhere in the classpath. In this case, place a copy in $REPOSITORY_HOME and modify it as explained. |
Improper Input Validation vulnerability in Apache Software Foundation Apache Airflow Apache Hive Provider.
This patch builds on top of the fix for CVE-2023-35797.
Before 6.1.2, the proxy_user option could also be used to inject a semicolon.
This issue affects Apache Airflow Apache Hive Provider: before 6.1.2.
It is recommended to update the provider to version 6.1.2 in order to avoid this vulnerability. |
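As an illustration of the class of fix (not the provider's actual code), a parameter that is later embedded in a Hive command line can be rejected outright if it contains a semicolon; the helper below is hypothetical:

public class HiveParamCheck {
    // Hypothetical helper: reject values such as proxy_user in which a ";"
    // would terminate the intended statement and append an attacker-controlled one.
    static String requireNoSemicolon(String name, String value) {
        if (value != null && value.contains(";")) {
            throw new IllegalArgumentException(name + " must not contain ';'");
        }
        return value;
    }

    public static void main(String[] args) {
        requireNoSemicolon("proxy_user", "etl_user");            // accepted
        requireNoSemicolon("proxy_user", "etl_user;!sh evil.sh"); // throws
    }
}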
Apache Airflow, in versions prior to 2.7.0, contains a security vulnerability that can be exploited by an authenticated user possessing Connection edit privileges. This vulnerability allows the user to access connection information and exploit the test connection feature by sending many requests, leading to a denial of service (DoS) condition on the server. Furthermore, malicious actors can leverage this vulnerability to establish harmful connections with the server.
Users of Apache Airflow are strongly advised to upgrade to version 2.7.0 or newer to mitigate the risk associated with this vulnerability. Additionally, administrators are encouraged to review and adjust user permissions to restrict access to sensitive functionalities, reducing the attack surface. |
Apache NiFi 0.0.2 through 1.22.0 include Processors and Controller Services that support HTTP URL references for retrieving drivers, which allows an authenticated and authorized user to configure a location that enables custom code execution. The resolution introduces a new Required Permission for referencing remote resources, restricting configuration of these components to privileged users. The permission prevents unprivileged users from configuring Processors and Controller Services annotated with the new Reference Remote Resources restriction. Upgrading to Apache NiFi 1.23.0 is the recommended mitigation. |
Improper Input Validation vulnerability in Apache Software Foundation Apache Airflow Hive Provider.
This issue affects Apache Airflow Apache Hive Provider: before 6.1.1.
Before version 6.1.1 it was possible to bypass the security check and achieve RCE via the principal parameter. For this to be exploited, the attacker requires access to modify the connection details.
It is recommended to update the provider to version 6.1.1 in order to avoid this vulnerability. |
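As with the proxy_user issue above, the general validation idea can be sketched (this is not the provider's actual patch, and the pattern below is an assumption): constrain the principal to the expected primary/instance@REALM shape before it reaches the connection string:

import java.util.regex.Pattern;

public class PrincipalCheck {
    // Assumed pattern for illustration: letters, digits and a few separators,
    // optionally followed by "/instance" and "@REALM".
    private static final Pattern PRINCIPAL =
            Pattern.compile("^[A-Za-z0-9._-]+(/[A-Za-z0-9._-]+)?(@[A-Za-z0-9._-]+)?$");

    static String validate(String principal) {
        if (principal != null && !PRINCIPAL.matcher(principal).matches()) {
            throw new IllegalArgumentException("Invalid Kerberos principal: " + principal);
        }
        return principal;
    }

    public static void main(String[] args) {
        validate("hive/warehouse.example.com@EXAMPLE.ORG"); // accepted
        validate("hive/host@REALM; rm -rf /");              // throws
    }
}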
Improper Neutralization of Special Elements Used in an SQL Command ('SQL Injection') vulnerability in Apache Software Foundation Apache InLong. This issue affects Apache InLong: from 1.4.0 through 1.7.0.
In the toAuditCkSql method, the groupId, streamId, auditId, and dt are directly concatenated into the SQL query statement, which may lead to SQL injection attacks.
Users are advised to upgrade to Apache InLong 1.8.0 or cherry-pick [1] to solve it.
[1] https://github.com/apache/inlong/pull/8198 |
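The general remediation pattern is shown below as a minimal sketch; the table and column names are assumptions for illustration, not InLong's actual schema. The point is that the four values are bound as parameters rather than concatenated into the SQL string:

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class AuditQuery {
    // Sketch only: bind groupId, streamId, auditId and dt as parameters so
    // attacker-supplied values cannot alter the structure of the query.
    static PreparedStatement buildAuditQuery(Connection conn, String groupId,
            String streamId, String auditId, String dt) throws SQLException {
        String sql = "SELECT * FROM audit_data "
                + "WHERE inlong_group_id = ? AND inlong_stream_id = ? "
                + "AND audit_id = ? AND log_ts = ?";
        PreparedStatement ps = conn.prepareStatement(sql);
        ps.setString(1, groupId);
        ps.setString(2, streamId);
        ps.setString(3, auditId);
        ps.setString(4, dt);
        return ps;
    }
}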
Apache Shiro, before 1.12.0 or 2.0.0-alpha-3, may be susceptible to a path traversal attack that results in an authentication bypass when used together with APIs or other web frameworks that route requests based on non-normalized request paths.
Mitigation: Update to Apache Shiro 1.12.0+ or 2.0.0-alpha-3+ |
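A minimal sketch of the underlying idea, assuming a servlet-style setup (this is not Shiro's actual fix): normalize the request path before any pattern-based authorization decision and reject anything that still looks like traversal:

import java.net.URI;

public class PathNormalization {
    // Illustrative only: collapse "." and ".." segments before matching the
    // path against authorization rules, and refuse residual traversal markers
    // such as "..;" that normalization does not resolve.
    static String normalize(String rawPath) {
        String normalized = URI.create(rawPath).normalize().getPath();
        if (normalized == null || normalized.contains("..")) {
            throw new IllegalArgumentException("Path traversal attempt: " + rawPath);
        }
        return normalized;
    }

    public static void main(String[] args) {
        System.out.println(normalize("/admin/./panel")); // -> /admin/panel
        System.out.println(normalize("/admin/..;/panel")); // throws
    }
}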
The DBCPConnectionPool and HikariCPConnectionPool Controller Services in Apache NiFi 0.0.2 through 1.21.0 allow an authenticated and authorized user to configure a Database URL with the H2 driver that enables custom code execution.
The resolution validates the Database URL and rejects H2 JDBC locations.
You are recommended to upgrade to version 1.22.0 or later which fixes this issue. |
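The validation concept can be illustrated with a short sketch (not NiFi's actual implementation): inspect the configured Database URL and refuse H2 JDBC locations before they reach the connection pool:

public class DatabaseUrlValidator {
    // Illustrative only: reject any JDBC URL that selects the H2 driver.
    static void validate(String jdbcUrl) {
        String normalized = jdbcUrl == null ? "" : jdbcUrl.trim().toLowerCase();
        if (normalized.startsWith("jdbc:h2:")) {
            throw new IllegalArgumentException("H2 JDBC locations are not allowed: " + jdbcUrl);
        }
    }

    public static void main(String[] args) {
        validate("jdbc:postgresql://db.example.com:5432/records");  // accepted
        validate("jdbc:h2:mem:test;TRACE_LEVEL_SYSTEM_OUT=3");      // throws
    }
}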
Deserialization of Untrusted Data Vulnerability in Apache Software Foundation Apache InLong. This issue affects Apache InLong: from 1.4.0 through 1.7.0.
The attacker could bypass the current logic and achieve arbitrary file reading. To solve it, users are advised to upgrade to Apache InLong's 1.8.0 or cherry-pick https://github.com/apache/inlong/pull/8130 . |
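While the InLong-specific patch is in the linked pull request, the general defense against untrusted Java deserialization can be sketched with the JDK's ObjectInputFilter; the allow-listed package below is a placeholder, not InLong's:

import java.io.ByteArrayInputStream;
import java.io.ObjectInputFilter;
import java.io.ObjectInputStream;

public class FilteredDeserialization {
    // General defensive pattern, not InLong's specific fix: only classes on the
    // allow list may be deserialized; everything else is rejected by "!*".
    static Object readTrusted(byte[] data) throws Exception {
        ObjectInputFilter filter = ObjectInputFilter.Config
                .createFilter("org.example.dto.*;java.lang.*;java.util.*;!*");
        try (ObjectInputStream in = new ObjectInputStream(new ByteArrayInputStream(data))) {
            in.setObjectInputFilter(filter);
            return in.readObject();
        }
    }
}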
Improper Input Validation vulnerability in Apache Software Foundation Apache Traffic Server. This issue affects Apache Traffic Server: through 9.2.1. |
Exposure of Sensitive Information to an Unauthorized Actor vulnerability in Apache Software Foundation Apache Traffic Server. This issue affects Apache Traffic Server: from 8.0.0 through 9.2.0.
8.x users should upgrade to 8.1.7 or later versions
9.x users should upgrade to 9.2.1 or later versions |
** UNSUPPORTED WHEN ASSIGNED ** The Apache Spark UI offers the possibility to enable ACLs via the configuration option spark.acls.enable. With an authentication filter, this checks whether a user has access permissions to view or modify the application. If ACLs are enabled, a code path in HttpSecurityFilter can allow someone to perform impersonation by providing an arbitrary user name. A malicious user might then be able to reach a permission check function that will ultimately build a Unix shell command based on their input, and execute it. This will result in arbitrary shell command execution as the user Spark is currently running as. This issue was disclosed earlier as CVE-2022-33891, but incorrectly claimed version 3.1.3 (which has since gone EOL) would not be affected.
NOTE: This vulnerability only affects products that are no longer supported by the maintainer.
Users are recommended to upgrade to a supported version of Apache Spark, such as version 3.4.0. |
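The command-execution aspect can be illustrated with a hedged sketch (not Spark's code): when an OS-level lookup needs a user name taken from a request, validate it and pass it as a discrete argument so it is never interpreted by a shell:

import java.io.IOException;
import java.util.regex.Pattern;

public class SafeUserLookup {
    // Assumed pattern for illustration: a conservative Unix account-name shape.
    private static final Pattern SAFE_USER = Pattern.compile("^[a-z_][a-z0-9_-]{0,31}$");

    static Process lookupGroups(String userName) throws IOException {
        if (!SAFE_USER.matcher(userName).matches()) {
            throw new IllegalArgumentException("Invalid user name: " + userName);
        }
        // "id -Gn <user>" is started without a shell, so a crafted name such as
        // "alice; rm -rf /" cannot become part of the command.
        return new ProcessBuilder("id", "-Gn", userName).start();
    }

    public static void main(String[] args) throws IOException {
        lookupGroups("spark");            // runs: id -Gn spark
        lookupGroups("alice; rm -rf /");  // throws before any command runs
    }
}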
Incorrect Authorization vulnerability in Apache Software Foundation Apache IoTDB. This issue affects the iotdb-web-workbench component on 0.13.3. iotdb-web-workbench is an optional component of IoTDB, providing a web console of the database.
This problem is fixed from version 0.13.4 of iotdb-web-workbench onwards. |
Improper Input Validation vulnerability in Apache Software Foundation Apache Traffic Server. The configuration option proxy.config.http.push_method_enabled didn't function. However, by default the PUSH method is blocked in the ip_allow configuration file. This issue affects Apache Traffic Server: from 8.0.0 through 9.2.0.
8.x users should upgrade to 8.1.7 or later versions
9.x users should upgrade to 9.2.1 or later versions |
A deserialization vulnerability existed when decoding a malicious package. This issue affects Apache Dubbo: from 3.1.0 through 3.1.10, from 3.2.0 through 3.2.4.
Users are recommended to upgrade to the latest version, which fixes the issue. |
In Apache Linkis <= 1.3.1, because parameters are not effectively filtered, an attacker can use the MySQL data source and malicious parameters to configure a new data source and trigger a deserialization vulnerability, eventually leading to remote code execution.
Versions of Apache Linkis <= 1.3.0 will be affected.
We recommend users upgrade Linkis to version 1.3.2. |
In Apache Linkis <= 1.3.1, due to the lack of effective filtering of parameters, an attacker configuring malicious MySQL JDBC parameters in the JDBC EngineConn module can trigger a deserialization vulnerability and eventually achieve remote code execution. Therefore, the parameters in the MySQL JDBC URL should be blacklisted. Versions of Apache Linkis <= 1.3.0 will be affected.
We recommend users upgrade Linkis to version 1.3.2. |
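A minimal sketch of such a blacklist (illustrative only, not Linkis code; the parameter list is indicative rather than exhaustive):

import java.util.List;
import java.util.Locale;

public class JdbcUrlBlacklist {
    // MySQL Connector/J parameters commonly abused for client-side
    // deserialization or file access; treat this list as an example.
    private static final List<String> FORBIDDEN = List.of(
            "autodeserialize", "allowloadlocalinfile", "allowurlinlocalinfile");

    static void check(String jdbcUrl) {
        String lower = jdbcUrl.toLowerCase(Locale.ROOT);
        for (String param : FORBIDDEN) {
            if (lower.contains(param)) {
                throw new IllegalArgumentException("Forbidden JDBC parameter: " + param);
            }
        }
    }

    public static void main(String[] args) {
        check("jdbc:mysql://db.example.com:3306/linkis");                    // accepted
        check("jdbc:mysql://evil.example.com:3306/x?autoDeserialize=true");  // throws
    }
}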
The fix for CVE-2023-24998 was incomplete for Apache Tomcat 11.0.0-M2 to 11.0.0-M4, 10.1.5 to 10.1.7, 9.0.71 to 9.0.73 and 8.5.85 to 8.5.87. If non-default HTTP connector settings were used such that the maxParameterCount could be reached using query string parameters and a request was submitted that supplied exactly maxParameterCount parameters in the query string, the limit for uploaded request parts could be bypassed with the potential for a denial of service to occur. |
Improper Input Validation vulnerability in Apache Software Foundation Apache Airflow Drill Provider. This issue affects Apache Airflow Drill Provider: before 2.3.2. |
In Apache Linkis <= 1.3.1, the PublicService module uploads files without restrictions on the upload path or file types.
We recommend users upgrade Linkis to version 1.3.2.
For versions <= 1.3.1, we suggest turning on the file path check switches in linkis.properties:
`wds.linkis.workspace.filesystem.owner.check=true`
`wds.linkis.workspace.filesystem.path.check=true` |
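The effect of a path check can be sketched as follows (illustrative only, not Linkis code; the workspace root shown is a made-up example): resolve the requested name against the workspace root and reject anything that escapes it:

import java.nio.file.Path;
import java.nio.file.Paths;

public class UploadPathCheck {
    // Sketch only: normalize the resolved path and require it to stay inside
    // the workspace root, so names like "../../etc/crontab" are rejected.
    static Path resolveInsideWorkspace(Path workspaceRoot, String requestedName) {
        Path target = workspaceRoot.resolve(requestedName).normalize();
        if (!target.startsWith(workspaceRoot.normalize())) {
            throw new IllegalArgumentException("Upload path escapes workspace: " + requestedName);
        }
        return target;
    }

    public static void main(String[] args) {
        Path root = Paths.get("/data/linkis/workspace/alice");
        System.out.println(resolveInsideWorkspace(root, "reports/result.csv")); // accepted
        System.out.println(resolveInsideWorkspace(root, "../../etc/crontab"));  // throws
    }
}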