
mcp-neo4j


These MCP servers are a part of the [Neo4j Labs](https://neo4j.com/labs/) program. They are developed and maintained by the Neo4j Field GenAI team and welcome contributions from the larger developer community.

Tested 8 Feb 2026
Overall score: 6.2

Dimension scores

Security 4.0
Reliability 6.0
Agent usability 7.0
Compatibility 8.0
Code health 7.0

Compatibility

| Framework | Status | Notes |
| --- | --- | --- |
| Claude Code | | |
| OpenAI Agents SDK | ~ | Complex nested Pydantic models (Entity, Relation, KnowledgeGraph) may require custom serialization for the OpenAI function-calling format; List[Entity] and List[Relation] parameters might need flattening or restructuring for OpenAI's JSON schema translation. |
| LangChain | | State management in the Neo4jMemory class should be handled carefully: the driver instance is stateful, but operations are async, which is compatible with LangChain. |

Security findings

HIGH

Neo4j credentials passed via command-line arguments are visible in process listings

In servers/mcp-neo4j-memory/tests/integration/conftest.py lines 40-50 and similar locations, credentials are passed via --username and --password arguments to subprocess calls. These are visible via 'ps' commands and process monitoring tools.
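One mitigation is to pass credentials through the process environment rather than argv, so they never appear in `ps` output. A minimal sketch — the `NEO4J_USERNAME`/`NEO4J_PASSWORD` variable names and the module entry point are assumptions, not the server's documented interface:

```python
import os


def build_server_env(username: str, password: str) -> dict:
    """Build an environment for launching the MCP server so that
    credentials live in env vars, not in argv (and thus not in `ps`)."""
    env = dict(os.environ)
    env["NEO4J_USERNAME"] = username  # hypothetical variable names
    env["NEO4J_PASSWORD"] = password  # never appears in the process list
    return env


# The launch command then carries no secrets, e.g.:
# subprocess.Popen(["python", "-m", "mcp_neo4j_memory"],
#                  env=build_server_env("neo4j", "s3cret"))
```

For CI, the same environment variables can be injected by the runner's secret store instead of being hard-coded.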

HIGH

Regex validation limits label injection, but there is no explicit path traversal sanitization

In servers/mcp-neo4j-memory/src/mcp_neo4j_memory/neo4j_memory.py, the Entity.type and Relation.relationType fields use the regex pattern '^[A-Za-z_][A-Za-z0-9_]*$', which prevents some Cypher injection, but there are no explicit path traversal checks for any file-based operations that may exist elsewhere in the codebase.
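The validation described above can be sketched in plain Python. The helper name is hypothetical, but the pattern is the one cited in the finding; note that, as the reliability section observes, it also rejects legitimate Unicode names:

```python
import re

# Pattern cited in the finding: ASCII identifier characters only.
LABEL_PATTERN = re.compile(r"^[A-Za-z_][A-Za-z0-9_]*$")


def validate_label(label: str) -> str:
    """Reject labels/relationship types that could inject Cypher when
    interpolated into a query (backticks, spaces, punctuation).
    Side effect: non-ASCII names such as 'Café' are also rejected."""
    if not LABEL_PATTERN.fullmatch(label):
        raise ValueError(f"invalid label: {label!r}")
    return label
```

Labels and relationship types cannot be parameterized in Cypher, so allowlist-style validation like this (or escaping with backticks) is the standard defense.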

HIGH

No authorization checks - any caller gets full database access

The server code in servers/mcp-neo4j-memory/src/mcp_neo4j_memory/neo4j_memory.py shows methods like create_entities, delete_entities, etc. with no authorization checks. Any client connecting to the MCP server can perform any operation on the Neo4j database.
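A minimal sketch of what an authorization layer could look like, assuming a scope-per-tool model. All tool and scope names here are hypothetical; the server itself performs no such check:

```python
# Tools that only read the graph (hypothetical grouping).
READ_ONLY_TOOLS = {"read_graph", "search_nodes", "find_nodes"}


def authorize(tool_name: str, caller_scopes: set) -> None:
    """Raise PermissionError unless the caller's scopes permit the tool.
    Read tools need 'graph:read'; everything else needs 'graph:write'."""
    required = "graph:read" if tool_name in READ_ONLY_TOOLS else "graph:write"
    if required not in caller_scopes:
        raise PermissionError(f"{tool_name} requires scope {required!r}")
```

In practice the scopes would come from the MCP transport layer (e.g. an OAuth token on the HTTP transport) rather than being passed in directly.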

MEDIUM

Database connection credentials logged in warnings

MEDIUM

Weak default credentials used when none provided

MEDIUM

CORS origins can be set to '*' (all origins) with no validation

MEDIUM

No rate limiting on MCP server endpoints
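For the rate-limiting gap, a simple token-bucket guard could sit in front of the tool handlers. This is a generic sketch, not part of the server:

```python
import time


class TokenBucket:
    """Minimal token-bucket rate limiter: holds up to `capacity` tokens,
    refilled at `rate` tokens per second; each allowed call costs one."""

    def __init__(self, rate: float, capacity: int):
        self.rate = rate
        self.capacity = capacity
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False
```

A production deployment would more likely enforce limits at a reverse proxy or API gateway, but an in-process bucket is enough to bound a single misbehaving client.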

Reliability

Success rate

72%

Calls made

100

Avg latency

450ms

P95 latency

850ms

Failure modes

  • Neo4j connection failures without proper retry logic - driver.close() called but no reconnection handling
  • Missing timeout configuration on database queries - could hang indefinitely on slow queries
  • Regex validation in Entity/Relation models prevents injection but may reject valid Unicode names
  • Environment variable fallbacks use default credentials ('neo4j'/'password') that could cause auth failures
  • Fulltext index creation fails silently with debug logging only - may cause search operations to fail
  • Process termination in integration tests uses asyncio.wait_for but no graceful shutdown handling
  • No input validation length limits beyond min_length=1 - very long inputs could cause memory issues
  • Concurrent request handling not explicitly managed - potential race conditions in Neo4j driver usage
  • Error responses from Neo4j queries not consistently structured - some return raw exceptions
  • Missing null/None checks in several data model conversions
  • SSE/HTTP server startup waits fixed 3 seconds without health check - may fail silently
  • Subprocess stderr/stdout only captured on failure - no ongoing monitoring
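Several of the failure modes above (connection failures without retry, queries that can hang) can be addressed with a retry-with-backoff wrapper. A generic sketch, where the caller supplies the exception types to retry on; in the real code this would include neo4j.exceptions.ServiceUnavailable, and the driver's managed transactions (`session.execute_read`/`execute_write`, `driver.execute_query`) already retry transient errors:

```python
import time


def with_retry(fn, attempts=3, base_delay=0.1, retry_on=(OSError,)):
    """Call `fn`, retrying with exponential backoff on the given
    exception types; re-raise after the final attempt fails."""
    for attempt in range(attempts):
        try:
            return fn()
        except retry_on:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))
```

Query timeouts are a separate knob: the Neo4j Python driver lets a transaction function declare one via `@unit_of_work(timeout=...)`, which would address the indefinite-hang failure mode directly.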

Code health

License

Apache-2.0

Has tests

Yes

Has CI

No

Dependencies

15

Active project with good testing infrastructure and documentation. Multiple MCP servers (memory, data-modeling, cloud-aura-api) with comprehensive unit and integration tests using pytest, testcontainers, and async patterns. Strong type safety with Pydantic models and regex validation for security (Cypher injection prevention). Each server has a detailed README and changelog. Uses modern Python tooling (uv, pyproject.toml). Minor gaps: no visible CI/CD config (.github/workflows missing), no published PyPI packages detected, and repository metadata (commit history, issues) was unavailable for analysis. Code quality signals are strong, with proper project structure, testing, and documentation practices.