
Scala - Spark - Neo4j - Delete nodes


I have a scala/spark application that interacts with Neo4j.

I'm able to add/update nodes (starting from a Dataset), but I would like to delete all existing nodes before starting the inserts.

To do that, I would like to submit the following Cypher query:

MATCH (n) DETACH DELETE n;

I tried the following code, without success:

import org.apache.spark.sql.{SaveMode, SparkSession}

def removeDbObjects(s: SparkSession): Unit = {
  // Attempt to run the delete statement as a write query;
  // this completes without errors, but nothing is deleted.
  val emptyDf = s.emptyDataFrame
  emptyDf.write
    .format("org.neo4j.spark.DataSource")
    .mode(SaveMode.Overwrite)
    .option("query", "MATCH (n) DETACH DELETE n;")
    .save()
}

I get no errors, but the nodes are not removed.
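For comparison, here is a minimal sketch of issuing the same statement directly through the official Neo4j Java driver (the URI and credentials below are placeholders), though I'd prefer a solution that stays within the Spark connector:

import org.neo4j.driver.{AuthTokens, GraphDatabase}

def removeDbObjectsViaDriver(): Unit = {
  // Placeholder connection details; adjust to the actual instance.
  val driver = GraphDatabase.driver("bolt://localhost:7687",
    AuthTokens.basic("neo4j", "password"))
  val session = driver.session()
  try {
    // Runs the statement exactly once, independent of any DataFrame rows.
    session.run("MATCH (n) DETACH DELETE n")
  } finally {
    session.close()
    driver.close()
  }
}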

I'm using the following connector dependency:

<dependency>
  <groupId>org.neo4j</groupId>
  <artifactId>neo4j-connector-apache-spark_2.12</artifactId>
  <version>5.0.3_for_spark_3</version>
</dependency>
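(If it helps, the sbt equivalent of those Maven coordinates, assuming the same version, would be:)

libraryDependencies += "org.neo4j" % "neo4j-connector-apache-spark_2.12" % "5.0.3_for_spark_3"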

Any help would be appreciated.

Thanks in advance.
