JPPF Issue Tracker
CLOSED  Bug report JPPF-500  -  Node Persistent Data Casting error
Posted Apr 19, 2017 - updated Apr 28, 2017
This issue has been closed with status "Not a bug" and resolution "NOT AN ISSUE".
Issue details
  • Type of issue
    Bug report
  • Status
     
    Not a bug
  • Assigned to
     lolo4j
  • Progress
       
  • Type of bug
    Not triaged
  • Likelihood
    Not triaged
  • Effect
    Not triaged
  • Posted by
     steve4j
  • Owned by
    Not owned by anyone
  • Category
    Node
  • Resolution
    NOT AN ISSUE
  • Priority
    High
  • Reproducibility
    Always
  • Severity
    Normal
  • Targeted for
    JPPF 5.2.7
Issue description
Referring to the snippet in http://www.jppf.org/forums/index.php?topic=748.0, I made a Task which uses HikariCP as a connection pool. When I start the client code the first time, everything is fine. But when I start it again, I get the following output:

cd C:\Users\swende01\Documents\NetBeansProjects\JPPFDBm; "JAVA_HOME=C:\\Program Files\\Java\\jdk1.8.0_121" cmd /c "\"\"C:\\Program Files\\NetBeans 8.2\\java\\maven\\bin\\mvn.bat\" -Dexec.args=\"-Xmx64m -Dlog4j.configuration=log4j.properties -Djppf.config=jppf.properties -Djava.util.logging.config.file=config/logging.properties -classpath %classpath de.itout.jppf.test.jppfdbm.Runner\" -Dexec.executable=\"C:\\Program Files\\Java\\jdk1.8.0_121\\bin\\java.exe\" -Dmaven.ext.class.path=\"C:\\Program Files\\NetBeans 8.2\\java\\maven-nblib\\netbeans-eventspy.jar\" -Dfile.encoding=UTF-8 -Djava.net.useSystemProxies=true process-classes org.codehaus.mojo:exec-maven-plugin:1.2.1:exec\""
Scanning for projects...
 
------------------------------------------------------------------------
Building JPPFDBm 1.0-SNAPSHOT
------------------------------------------------------------------------
 
--- maven-resources-plugin:2.5:resources (default-resources) @ JPPFDBm ---
[debug] execute contextualize
Using 'UTF-8' encoding to copy filtered resources.
skip non existing resourceDirectory C:\Users\swende01\Documents\NetBeansProjects\JPPFDBm\src\main\resources
 
--- maven-compiler-plugin:2.3.2:compile (default-compile) @ JPPFDBm ---
Nothing to compile - all classes are up to date
 
--- exec-maven-plugin:1.2.1:exec (default-cli) @ JPPFDBm ---
log4j:WARN No appenders could be found for logger (org.jppf.utils.JPPFConfiguration).
log4j:WARN Please initialize the log4j system properly.
client process id: 1088, uuid: BDB1A448-26B2-738A-C31A-AF1B490F1FFE
[client: jppf_discovery-1-1 - ClassServer] Attempting connection to the class server at localhost:11111
[client: jppf_discovery-1-1 - ClassServer] Reconnected to the class server
[client: jppf_discovery-1-1 - TasksServer] Attempting connection to the task server at localhost:11111
[client: jppf_discovery-1-1 - TasksServer] Reconnected to the JPPF task server
[client: jppf_discovery-1-2 - ClassServer] Attempting connection to the class server at localhost:11111
[client: jppf_discovery-1-2 - ClassServer] Reconnected to the class server
[client: jppf_discovery-1-2 - TasksServer] Attempting connection to the task server at localhost:11111
[client: jppf_discovery-1-2 - TasksServer] Reconnected to the JPPF task server
[client: jppf_discovery-1-3 - ClassServer] Attempting connection to the class server at localhost:11111
[client: jppf_discovery-1-3 - ClassServer] Reconnected to the class server
[client: jppf_discovery-1-3 - TasksServer] Attempting connection to the task server at localhost:11111
[client: jppf_discovery-1-3 - TasksServer] Reconnected to the JPPF task server
[client: jppf_discovery-1-4 - ClassServer] Attempting connection to the class server at localhost:11111
[client: jppf_discovery-1-4 - ClassServer] Reconnected to the class server
[client: jppf_discovery-1-4 - TasksServer] Attempting connection to the task server at localhost:11111
[client: jppf_discovery-1-4 - TasksServer] Reconnected to the JPPF task server
[client: jppf_discovery-1-5 - ClassServer] Attempting connection to the class server at localhost:11111
[client: jppf_discovery-1-5 - ClassServer] Reconnected to the class server
[client: jppf_discovery-1-5 - TasksServer] Attempting connection to the task server at localhost:11111
[client: jppf_discovery-1-5 - TasksServer] Reconnected to the JPPF task server
Doing something while the jobs are executing ...
Results for job 'Template concurrent job 1' :
Template concurrent job 1 - DB task, an exception was raised: com.zaxxer.hikari.HikariDataSource cannot be cast to com.zaxxer.hikari.HikariDataSource
Results for job 'Template concurrent job 2' :
Template concurrent job 2 - DB task, an exception was raised: com.zaxxer.hikari.HikariDataSource cannot be cast to com.zaxxer.hikari.HikariDataSource
Results for job 'Template concurrent job 3' :
Template concurrent job 3 - DB task, an exception was raised: com.zaxxer.hikari.HikariDataSource cannot be cast to com.zaxxer.hikari.HikariDataSource
Results for job 'Template concurrent job 4' :
Template concurrent job 4 - DB task, an exception was raised: com.zaxxer.hikari.HikariDataSource cannot be cast to com.zaxxer.hikari.HikariDataSource
Results for job 'Template concurrent job 5' :
Template concurrent job 5 - DB task, an exception was raised: com.zaxxer.hikari.HikariDataSource cannot be cast to com.zaxxer.hikari.HikariDataSource
------------------------------------------------------------------------
BUILD SUCCESS
------------------------------------------------------------------------
Total time: 10.582s
Finished at: Wed Apr 19 10:02:18 CEST 2017
Final Memory: 5M/123M
------------------------------------------------------------------------


Maybe the node's class loader fetches the bytecode a second time and is then unable to cast the "old" stored object to the newly loaded class?

The main goal is to set up a connection pool on a node and let the tasks on that node work against the database on behalf of different clients.

=== The Database ===
CREATE TABLE JPPFTEST (
  IDENT NUMERIC(8,0) NOT NULL,
  RES VARCHAR(255),
  CONSTRAINT PK_JPPFTEST PRIMARY KEY (IDENT)
);


=== The code of the Task ===
package de.itout.jppf.test.jppfdbm;
 
import com.zaxxer.hikari.HikariConfig;
import com.zaxxer.hikari.HikariDataSource;
import java.io.UnsupportedEncodingException;
import java.net.InetAddress;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import org.jppf.node.protocol.AbstractTask;
import org.jppf.node.NodeRunner;
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;
import java.util.logging.Level;
import java.util.logging.Logger;
 
/**
 *
 * @author swende01
 */
public class DBTask extends AbstractTask<String> {
 
    @Override
    public void run() {
        System.out.println("running task:" + this.getId());
        System.out.println("calling for datasource");
        HikariDataSource ds = getDataSource();
        System.out.println("datasource returned");
        if (ds == null) {
            System.out.println("datasource==null");
        }
        try {
            Connection conn = ds.getConnection();
            System.out.println("connection created");
            if (conn == null) {
                System.out.println("conn==null");
            }
            String res = calculateHash();
            Statement stmt = conn.createStatement();
            if (stmt == null) {
                System.out.println("stmt==null");
            }
            String host = InetAddress.getLocalHost().getHostName();
            System.out.println("host:" + host);
            String q = "INSERT INTO JPPFTEST VALUES ("+getNextID(conn)+",'" + res + "')";
            System.out.println(q);
            stmt.executeUpdate(q);
            stmt.close();
            stmt = null;
            conn.close();
            conn = null;
            setResult(res);
        } catch (Exception ex) {
            System.out.println(ex);
        }
    }
 
    private int getNextID(Connection con){
        Statement stmt = null;
        String query = "SELECT MAX(IDENT)+1 FROM JPPFTEST";
        try {
            stmt = con.createStatement();
            ResultSet rs = stmt.executeQuery(query);
            while (rs.next()) {
                int id = rs.getInt(1);
                return id;
            }
        } catch (SQLException e) {
            System.err.println(e);
        } finally {
            if (stmt != null) {
                try {
                    stmt.close();
                } catch (SQLException ex) {
                    System.err.println(ex);
                }
            }
        }
        return -1;
    }
 
    private String calculateHash() {
        System.out.println("Generate Random Numbers...");
        double a = Math.random();
        double b = Math.random();
        System.out.println("Random Numbers are A="+a+" and B="+b);
        MessageDigest md;
        String result = "";
        try {
            md = MessageDigest.getInstance("SHA-256");
            String text = a+""+b+"there is salt in the sea";
            System.out.println("Encrypt the two numbers with a salt ["+text+"]");
            md.update(text.getBytes("UTF-8")); 
            byte[] digest = md.digest();
            result = String.format("%064x", new java.math.BigInteger(1, digest));
            System.out.println("Encryted text is["+result+"]");
        } catch (NoSuchAlgorithmException | UnsupportedEncodingException ex) {
            System.err.println(ex);
        }
        return result;
    }
 
    protected static HikariDataSource setUpDataSource() {
        HikariConfig config = new HikariConfig();
        config.setJdbcUrl("SOMEJDBCURL");
        config.setUsername("user");
        config.setPassword("pw");
        config.addDataSourceProperty("cachePrepStmts", "true");
        config.addDataSourceProperty("prepStmtCacheSize", "250");
        config.addDataSourceProperty("prepStmtCacheSqlLimit", "2048");
 
        HikariDataSource dataSource = new HikariDataSource(config);
        NodeRunner.setPersistentData("datasource", dataSource);
        return dataSource;
    }
 
    public synchronized static HikariDataSource getDataSource() {
        System.out.println("returning dataSource");
        HikariDataSource ds = (HikariDataSource) NodeRunner.getPersistentData("datasource");
        if (ds == null) {
            System.out.println("setting up dataSource");
            ds = setUpDataSource();
        }
        return ds;
    }
}


=== The Client Runner Class ===
/*
 * To change this license header, choose License Headers in Project Properties.
 * To change this template file, choose Tools | Templates
 * and open the template in the editor.
 */
package de.itout.jppf.test.jppfdbm;
 
import java.util.ArrayList;
import java.util.List;
import org.jppf.client.*;
import org.jppf.node.protocol.Task;
 
/**
 *
 * @author swende01
 */
public class Runner {
 
    public static void main(String[] args) {
 
        // create the JPPFClient. This constructor call causes JPPF to read the configuration file
        // and connect with one or multiple JPPF drivers.
        try (JPPFClient jppfClient = new JPPFClient()) {
 
            // create a runner instance.
            Runner runner = new Runner();
 
            // create and execute a blocking job
//      runner.executeBlockingJob(jppfClient);
            // create and execute a non-blocking job
            //runner.executeNonBlockingJob(jppfClient);
            // create and execute 3 jobs concurrently
            runner.executeMultipleConcurrentJobs(jppfClient, 5);
 
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
 
    public void executeMultipleConcurrentJobs(final JPPFClient jppfClient, final int numberOfJobs) throws Exception {
        // ensure that the client connection pool has as many connections
        // as the number of jobs to execute
        ensureNumberOfConnections(jppfClient, numberOfJobs);
 
        // this list will hold all the jobs submitted for execution,
        // so we can later collect and process their results
        final List<JPPFJob> jobList = new ArrayList<>(numberOfJobs);
 
        // create and submit all the jobs
        for (int i = 1; i <= numberOfJobs; i++) {
            // create a job with a distinct name
            JPPFJob job = createJob("Template concurrent job " + i);
 
            // set the job in non-blocking (or asynchronous) mode.
            job.setBlocking(false);
 
            // submit the job for execution, without blocking the current thread
            jppfClient.submitJob(job);
 
            // add this job to the list
            jobList.add(job);
        }
 
        // the non-blocking jobs are submitted asynchronously, we can do anything else in the meantime
        System.out.println("Doing something while the jobs are executing ...");
 
        // wait until the jobs are finished and process their results.
        for (JPPFJob job : jobList) {
            // wait if necessary for the job to complete and collect its results
            List<Task<?>> results = job.awaitResults();
 
            // process the job results
            processExecutionResults(job.getName(), results);
        }
    }
 
    /**
     * Create a JPPF job that can be submitted for execution.
     *
     * @param jobName an arbitrary, human-readable name given to the job.
     * @return an instance of the {@link org.jppf.client.JPPFJob JPPFJob} class.
     * @throws Exception if an error occurs while creating the job or adding
     * tasks.
     */
    public JPPFJob createJob(final String jobName) throws Exception {
        // create a JPPF job
        JPPFJob job = new JPPFJob();
        // give this job a readable name that we can use to monitor and manage it.
        job.setName(jobName);
 
        // add a task to the job.
        Task<?> task = job.add(new DBTask());
        // provide a user-defined name for the task
        task.setId(jobName + " - DB task");
 
        // add more tasks here ...
        // there is no guarantee on the order of execution of the tasks,
        // however the results are guaranteed to be returned in the same order as the tasks.
        return job;
    }
 
    /**
     * Process the execution results of each submitted task.
     *
     * @param jobName the name of the job whose results are processed.
     * @param results the tasks results after execution on the grid.
     */
    public synchronized void processExecutionResults(final String jobName, final List<Task<?>> results) {
        // print a results header
        System.out.printf("Results for job '%s' :\n", jobName);
        // process the results
        for (Task<?> task : results) {
            String taskName = task.getId();
            // if the task execution resulted in an exception
            if (task.getThrowable() != null) {
                // process the exception here ...
                System.out.println(taskName + ", an exception was raised: " + task.getThrowable().getMessage());
            } else {
                // process the result here ...
                System.out.println(taskName + ", execution result: " + task.getResult());
            }
        }
    }
 
    /**
     * Ensure that the JPPF client has the desired number of connections.
     *
     * @param jppfClient the JPPF client which submits the jobs.
     * @param numberOfConnections the desired number of connections.
     * @throws Exception if any error occurs.
     */
    public void ensureNumberOfConnections(final JPPFClient jppfClient, final int numberOfConnections) throws Exception {
        // wait until the client has at least one connection pool with at least one available connection
        JPPFConnectionPool pool = jppfClient.awaitActiveConnectionPool();
 
        // if the pool doesn't have the expected number of connections, change its size
        if (pool.getConnections().size() != numberOfConnections) {
            // set the pool size to the desired number of connections
            pool.setSize(numberOfConnections);
        }
 
        // wait until all desired connections are available (ACTIVE status)
        pool.awaitActiveConnections(Operator.AT_LEAST, numberOfConnections);
    }
}
Steps to reproduce this issue
  1. Create the database table with the SQL above
  2. Set up a Java project with the code above and edit the DB connection properties
  3. Start a driver and a node
  4. Start the project once
  5. Start the project a second time

#3
Comment posted by
 lolo4j
Apr 19, 18:50
The problem is indeed one of classes loaded by different class loaders. The node will create a separate class loader for each distinct client whose tasks it executes. Each of these class loaders will be able to load classes or resources from the corresponding client's classpath. The node also has a single "server" class loader, which loads resources from the driver's classpath and is also the parent class loader for all "client" class loaders. To fix the issue, I recommend you deploy the HikariCP libraries in the driver's classpath (or in each node's classpath, but that's much more work). For more details, please read the documentation on class loading.
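To see the mismatch concretely, here is a minimal diagnostic sketch (not part of the original snippet; it assumes it is placed inside DBTask.getDataSource() from the issue description, right before the cast) that prints which class loader defined the cached object and which one defined the HikariDataSource class visible to the current task:

Object cached = NodeRunner.getPersistentData("datasource");
if (cached != null) {
    // defined by the class loader of the client whose task first created and cached the pool
    System.out.println("cached object loaded by: " + cached.getClass().getClassLoader());
    // defined by the class loader of the client currently executing the task
    System.out.println("HikariDataSource visible here loaded by: " + HikariDataSource.class.getClassLoader());
    // when these two loaders differ, (HikariDataSource) cached throws the reported
    // ClassCastException, even though both classes are named com.zaxxer.hikari.HikariDataSource
}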

To conclude, what you observe is the expected behavior, unless I missed something?
#5
Comment posted by
 steve4j
Apr 20, 17:54, in reply to comment #3
Hello Laurent,

after I added the HikariCP jar file to the driver's classpath, everything works fine. The next step is to add Hibernate; sadly there is no documentation about this on the official JPPF page.

So I added all the Hibernate jars to the driver and made a new task class, an entity and a persistence.xml. The first client runs the code perfectly, but the second time I got a similar error.

Entity

package de.itout.jppf.test.entities;
 
import java.io.Serializable;
import javax.persistence.Basic;
import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.Id;
import javax.persistence.NamedQueries;
import javax.persistence.NamedQuery;
import javax.persistence.Table;
 
/**
 *
 * @author swende01
 */
@Entity
@Table(name = "JPPFTEST")
@NamedQueries(
{
  @NamedQuery(name = "Jppftest.findAll", query = "SELECT j FROM Jppftest j ORDER BY j.ident"),
  @NamedQuery(name = "Jppftest.nextnumber",query = "SELECT MAX(j.ident)+1 FROM Jppftest j")
})
public class Jppftest implements Serializable
{
 
  private static final long serialVersionUID = 1L;
  @Id
  @Basic(optional = false)
  @Column(name = "IDENT")
  private Integer ident;
  @Column(name = "RES")
  private String res;
 
  public Jppftest()
  {
  }
 
  public Jppftest(Integer ident)
  {
    this.ident = ident;
  }
 
  public Integer getIdent()
  {
    return ident;
  }
 
  public void setIdent(Integer ident)
  {
    this.ident = ident;
  }
 
  public String getRes()
  {
    return res;
  }
 
  public void setRes(String res)
  {
    this.res = res;
  }
 
  @Override
  public int hashCode()
  {
    int hash = 0;
    hash += (ident != null ? ident.hashCode() : 0);
    return hash;
  }
 
  @Override
  public boolean equals(Object object)
  {
    // TODO: Warning - this method won't work in the case the id fields are not set
    if (!(object instanceof Jppftest))
    {
      return false;
    }
    Jppftest other = (Jppftest) object;
    if ((this.ident == null && other.ident != null) || (this.ident != null && !this.ident.equals(other.ident)))
    {
      return false;
    }
    return true;
  }
 
  @Override
  public String toString()
  {
    return "de.hur.e1interfaces.tst.Jppftest[ ident=" + ident + ", res="+res+" ]";
  }
 
}


Task

package de.itout.jppf.test.jppfdbm;
 
import de.itout.jppf.test.entities.Jppftest;
import java.io.UnsupportedEncodingException;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import javax.persistence.EntityManager;
import javax.persistence.EntityManagerFactory;
import javax.persistence.Persistence;
import org.jppf.node.NodeRunner;
import org.jppf.node.protocol.AbstractTask;
 
/**
 *
 * @author swende01
 */
public class HibernateTask extends AbstractTask<String> {
 
    @Override
    public void run() {
        EntityManager em = getEm();
        try {
            String res = calculateHash();
            Jppftest jppftest = new Jppftest(getNextID(em));
            jppftest.setRes(res);
            em.getTransaction().begin();
            em.persist(jppftest);
            em.getTransaction().commit();
            setResult(res);
        } catch (Exception e) {
            System.err.println(e);
        } finally {
            em.close();
        }
    }
 
    private int getNextID(EntityManager em) {
        Integer i = (Integer) em.createNamedQuery("Jppftest.nextnumber").getSingleResult();
        if (i != null) {
            return i;
        }
        return -1;
    }
 
    private String calculateHash() {
        System.out.println("Generate Random Numbers...");
        double a = Math.random();
        double b = Math.random();
        System.out.println("Random Numbers are A=" + a + " and B=" + b);
        MessageDigest md;
        String result = "";
        try {
            md = MessageDigest.getInstance("SHA-256");
            String text = a + "" + b + "there is salt in the sea";
            System.out.println("Encrypt the two numbers with a salt [" + text + "]");
            md.update(text.getBytes("UTF-8"));
            byte[] digest = md.digest();
            result = String.format("%064x", new java.math.BigInteger(1, digest));
            System.out.println("Encryted text is[" + result + "]");
        } catch (NoSuchAlgorithmException | UnsupportedEncodingException ex) {
            System.err.println(ex);
        }
        return result;
    }
 
    protected static EntityManagerFactory setUpEmf() {
        EntityManagerFactory emf = Persistence.createEntityManagerFactory("TestPU");
        NodeRunner.setPersistentData("emf", emf);
        return emf;
    }
 
    public synchronized static EntityManager getEm() {
        System.out.println("returning emf");
        EntityManagerFactory emf = (EntityManagerFactory) NodeRunner.getPersistentData("emf");
        if (emf == null) {
            System.out.println("setting up emf");
            emf = setUpEmf();
        }
        return emf.createEntityManager();
    }
}


persistence.xml

<?xml version="1.0" encoding="UTF-8"?>
<persistence version="2.1" xmlns="http://xmlns.jcp.org/xml/ns/persistence" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://xmlns.jcp.org/xml/ns/persistence http://xmlns.jcp.org/xml/ns/persistence/persistence_2_1.xsd">
  <persistence-unit name="TestPU" transaction-type="RESOURCE_LOCAL">
    <provider>org.hibernate.jpa.HibernatePersistenceProvider</provider>
    <class>de.itout.jppf.test.entities.Jppftest</class>
    <properties>
      <property name="javax.persistence.jdbc.driver" value="com.ibm.db2.jcc.DB2Driver "/>
      <property name="javax.persistence.jdbc.url" value="jdbc:db2://SomeServer"/>
      <property name="javax.persistence.jdbc.user" value="user"/>
      <property name="javax.persistence.jdbc.password" value="pw"/>
      <property name="hibernate.dialect" value="org.hibernate.dialect.DB2Dialect"/>
      <property name="hibernate.connection.provider_class" value="com.zaxxer.hikari.hibernate.HikariConnectionProvider"/>
      <property name="hibernate.hikari.minimumIdle" value="5"/>
      <property name="hibernate.hikari.maximumPoolSize" value="10"/>
      <property name="hibernate.hikari.idleTimeout" value="30000"/>
      <property name="hibernate.hikari.dataSource.url" value="jdbc:db2://Server"/>
      <property name="hibernate.hikari.dataSource.user" value="user"/>
      <property name="hibernate.hikari.dataSource.password" value="pw"/>
    </properties>
  </persistence-unit>
</persistence>


Node Output

node process id: 9352, uuid: C98631DA-0062-860B-23EF-2C290DBFC4F3
Attempting connection to the class server at localhost:11111
RemoteClassLoaderConnection: Reconnected to the class server
JPPF Node management initialized on port 12001
Attempting connection to the node server at localhost:11111
Reconnected to the node server
Node successfully initialized
returning emf
setting up emf
Generate Random Numbers...
Random Numbers are A=0.8626820599363576 and B=0.21332831489095583
Encrypt the two numbers with a salt [0.86268205993635760.21332831489095583there is salt in the sea]
Encryted text is[d3d1a53163abe6519328adcb7a21ea02fa5eb328c89067501c9d143bdf2cb956]
returning emf
Generate Random Numbers...
Random Numbers are A=0.4010895783093096 and B=0.8660913618927831
Encrypt the two numbers with a salt [0.40108957830930960.8660913618927831there is salt in the sea]
Encryted text is[41b16c87ce99ea431ee514b9223b8ae9e5b7278cdaf50ff1573eaa94f35f6d09]
returning emf
Generate Random Numbers...
Random Numbers are A=0.03066768587841795 and B=0.5026272441207676
Encrypt the two numbers with a salt [0.030667685878417950.5026272441207676there is salt in the sea]
Encryted text is[c1b1037f49391779caac86044140ffd35ec5a7829db3c15ac3428f74e16ddcc0]
returning emf
Generate Random Numbers...
Random Numbers are A=0.8505007578350456 and B=0.15897579491449665
Encrypt the two numbers with a salt [0.85050075783504560.15897579491449665there is salt in the sea]
Encryted text is[3ab36cfd8f03288d061ae0d95e73423bcbace762e0230308f597f7406a151da8]
returning emf
Generate Random Numbers...
Random Numbers are A=0.856131562965201 and B=0.17364549263599716
Encrypt the two numbers with a salt [0.8561315629652010.17364549263599716there is salt in the sea]
Encryted text is[c9e251d75f1387c1335f8069eff2f5c93bc313ffdd5e3612ba9027a1d80341d7]
returning emf
Generate Random Numbers...
Random Numbers are A=0.32266038900057414 and B=0.40228294044984514
Encrypt the two numbers with a salt [0.322660389000574140.40228294044984514there is salt in the sea]
Encryted text is[f98ba7f808b879b413b67f07f1cbb16595ce232b5b4fb50ab1f7d2563448e3a9]
javax.persistence.PersistenceException: org.hibernate.property.access.spi.PropertyAccessException: Error accessing field [private java.lang.Integer de.itout.jppf.test.entities.Jppftest.ident] by reflection for persistent property [de.itout.jppf.test.entities.Jppftest#ident] : de.hur.e1interfaces.tst.Jppftest[ ident=38, res=f98ba7f808b879b413b67f07f1cbb16595ce232b5b4fb50ab1f7d2563448e3a9 ]
returning emf
Generate Random Numbers...
Random Numbers are A=0.7641286715172553 and B=0.8247163569751376
Encrypt the two numbers with a salt [0.76412867151725530.8247163569751376there is salt in the sea]
Encryted text is[54aff2737f58047831b30235a59c7df7dbaa75890dccaeb9a48a24119fa8bdfe]
javax.persistence.PersistenceException: org.hibernate.property.access.spi.PropertyAccessException: Error accessing field [private java.lang.Integer de.itout.jppf.test.entities.Jppftest.ident] by reflection for persistent property [de.itout.jppf.test.entities.Jppftest#ident] : de.hur.e1interfaces.tst.Jppftest[ ident=38, res=54aff2737f58047831b30235a59c7df7dbaa75890dccaeb9a48a24119fa8bdfe ]
returning emf
Generate Random Numbers...
Random Numbers are A=0.9131877107455606 and B=0.30565878052290973
Encrypt the two numbers with a salt [0.91318771074556060.30565878052290973there is salt in the sea]
Encryted text is[ee3000b3ce4c0a0c1909197293c1ff9cc0f5c7a39e08eac26c26e3a474a6c438]
javax.persistence.PersistenceException: org.hibernate.property.access.spi.PropertyAccessException: Error accessing field [private java.lang.Integer de.itout.jppf.test.entities.Jppftest.ident] by reflection for persistent property [de.itout.jppf.test.entities.Jppftest#ident] : de.hur.e1interfaces.tst.Jppftest[ ident=38, res=ee3000b3ce4c0a0c1909197293c1ff9cc0f5c7a39e08eac26c26e3a474a6c438 ]
returning emf
Generate Random Numbers...
Random Numbers are A=0.0482967653072921 and B=0.11444709169915035
Encrypt the two numbers with a salt [0.04829676530729210.11444709169915035there is salt in the sea]
Encryted text is[2b7224fc960e155fdf21ae93f3255ac87be91f9676f73e02710afb36d833f2df]
javax.persistence.PersistenceException: org.hibernate.property.access.spi.PropertyAccessException: Error accessing field [private java.lang.Integer de.itout.jppf.test.entities.Jppftest.ident] by reflection for persistent property [de.itout.jppf.test.entities.Jppftest#ident] : de.hur.e1interfaces.tst.Jppftest[ ident=38, res=2b7224fc960e155fdf21ae93f3255ac87be91f9676f73e02710afb36d833f2df ]
returning emf
Generate Random Numbers...
Random Numbers are A=0.4942331914967587 and B=0.41554934788576114
Encrypt the two numbers with a salt [0.49423319149675870.41554934788576114there is salt in the sea]
Encryted text is[156dd00dc06171d75b04756edfdf223f4bedc1bd0a5abe723380fb6e77308129]
javax.persistence.PersistenceException: org.hibernate.property.access.spi.PropertyAccessException: Error accessing field [private java.lang.Integer de.itout.jppf.test.entities.Jppftest.ident] by reflection for persistent property [de.itout.jppf.test.entities.Jppftest#ident] : de.hur.e1interfaces.tst.Jppftest[ ident=38, res=156dd00dc06171d75b04756edfdf223f4bedc1bd0a5abe723380fb6e77308129 ]


Maybe Hibernate stores the entity class and the second client reloads it. But I don't want to put my entities inside another jar and deliver it by hand to the driver's lib-ext.

Maybe a best practice would be a node method that lets me persist a datasource / EntityManagerFactory for all clients, so that just one class loader is involved in this special case.



#6
Comment posted by
 lolo4j
Apr 21, 09:11
Hello,

To try and understand how the entity class is loaded, I added this piece of code to the class Jppftest:
public class Jppftest implements Serializable {
  static {
    System.out.println("loading Jppftest.class from this call stack:\n" + ExceptionUtils.getCallStack());
  }
 
  ...
}
This provides the following output at first execution of the task:
loading Jppftest.class from this call stack:
  at test.hibernate.Jppftest.<clinit>(Jppftest.java:44)
  at java.lang.Class.forName0(Native Method)
  at java.lang.Class.forName(Class.java:278)
  at org.hibernate.boot.registry.classloading.internal.ClassLoaderServiceImpl.classForName(ClassLoaderServiceImpl.java:226)
  at org.hibernate.boot.model.source.internal.annotations.AnnotationMetadataSourceProcessorImpl.<init>(AnnotationMetadataSourceProcessorImpl.java:103)
  at org.hibernate.boot.model.process.spi.MetadataBuildingProcess$1.<init>(MetadataBuildingProcess.java:147)
  at org.hibernate.boot.model.process.spi.MetadataBuildingProcess.complete(MetadataBuildingProcess.java:141)
  at org.hibernate.jpa.boot.internal.EntityManagerFactoryBuilderImpl.metadata(EntityManagerFactoryBuilderImpl.java:847)
  at org.hibernate.jpa.boot.internal.EntityManagerFactoryBuilderImpl.build(EntityManagerFactoryBuilderImpl.java:874)
  at org.hibernate.jpa.HibernatePersistenceProvider.createEntityManagerFactory(HibernatePersistenceProvider.java:58)
  at javax.persistence.Persistence.createEntityManagerFactory(Persistence.java:55)
  at javax.persistence.Persistence.createEntityManagerFactory(Persistence.java:39)
  at test.hibernate.HibernateTask.setUpEmf(HibernateTask.java:84)
  at test.hibernate.HibernateTask.getEm(HibernateTask.java:94)
  at test.hibernate.HibernateTask.run(HibernateTask.java:37)
  at org.jppf.execute.NodeTaskWrapper.run(NodeTaskWrapper.java:158)
  at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
  at java.util.concurrent.FutureTask.run(FutureTask.java:262)
  at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
  at java.util.concurrent.FutureTask.run(FutureTask.java:262)
  at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
  at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
  at java.lang.Thread.run(Thread.java:745)
According to this, the entity class is actually loaded when the EntityManagerFactory is created, hence the problem.

So here there isn't much that can be done. I don't think even customizing the Hibernate bootstrapping will help.

Right now, I see only 2 ways to get around this problem from the JPPF perspective:
  1. put the entity classes in the driver's classpath. Drawback: it's an additional deployment burden.
  2. use the same client UUID across all JPPF clients, with the constructor new JPPFClient(String uuid), as sketched below. Drawback: when the code of the entity changes, the uuid has to change too or the classes will not be reloaded, and the uuid must remain synchronized across all clients.
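As an illustration of option 2, a minimal sketch of what the Runner's main method would look like (the uuid value is made up for this example; the rest mirrors the Runner class posted above):

// every client JVM passes the same uuid, so the node keeps reusing a single client class loader
// and the object cached via NodeRunner.setPersistentData() keeps the same class;
// bump the value whenever the task or entity classes change, or the node keeps the old versions
String sharedUuid = "jppfdbm-client-v1"; // made-up value
try (JPPFClient jppfClient = new JPPFClient(sharedUuid)) {
    new Runner().executeMultipleConcurrentJobs(jppfClient, 5);
} catch (Exception e) {
    e.printStackTrace();
}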
#7
Comment posted by
 steve4j
Apr 21, 16:26, in reply to comment #6
Hello Laurent,

thank you for your fast reply. I tested option 2, but as soon as I change the entity and change the uuid of the client, the single node keeps waiting for the older tasks with the old classes to finish. I react to the ClassCastException by opening a new emf, but the old connection pool stays open, because I can't close the old emf since I can't cast it.

public synchronized static EntityManager getEm() {
    System.out.println("returning emf");
    EntityManagerFactory emf = null;
    try {
        emf = (EntityManagerFactory) NodeRunner.getPersistentData("emf");
    } catch (ClassCastException e) {
        System.err.println(e);
        emf = setUpEmf();
    }
    if (emf == null) {
        System.out.println("setting up emf");
        emf = setUpEmf();
    }
    return emf.createEntityManager();
}


[screenshot: double connection pool MBeans]

[screenshot: Threads]

Is there a way to close the old emf when I create a new one?



#8
Comment posted by
 lolo4j
Apr 24, 09:28
Since the EntityManagerFactory is tied to the entity classes, caching it is not the right solution. My understanding is that we actually want to cache the connection pool. Thus, one approach is to instantiate the connection pool independently from JPA and then inject it when creating the EntityManagerFactory.

From what I've seen in the Hibernate documentation, it supports the "hibernate.connection.datasource" configuration property, which can reference either a JNDI name or a concrete DataSource implementation. This allows providing the datasource when creating the EntityManagerFactory via Persistence.createEntityManagerFactory(String, Map). According to this, I tested the following additions/changes in HibernateTask:
public class HibernateTask extends AbstractTask<String> {
  private static EntityManagerFactory emf; // re-created for each new client class loader
 
  ...
 
  public synchronized static EntityManager getEm() {
    System.out.println("returning emf");
    if (emf == null) {
      Map<String, Object> map = new HashMap<>();
      DataSource ds = getDataSource();
      map.put("hibernate.connection.datasource", ds);
      emf = Persistence.createEntityManagerFactory("TestPU", map);
    }
    return emf.createEntityManager();
  }
 
  private static DataSource getDataSource() {
    HikariDataSource ds = (HikariDataSource) NodeRunner.getPersistentData("ds");
    if (ds == null) {
      System.out.println("Datasource not found in cache, creating it");
      ds = new HikariDataSource();
      ds.setJdbcUrl("jdbc:mysql://localhost:3306/hibernate_test");
      ds.setUsername("jppftest");
      ds.setPassword("jppftest");
      ds.setMaximumPoolSize(10);
      ds.setMinimumIdle(5);
      NodeRunner.setPersistentData("ds", ds);
    } else {
      System.out.println("Datasource found in cache!");
    }
    return ds;
  }
}
This works like a charm for me.
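One thing the sketch above does not cover is releasing the cached pool when the node shuts down. A possible cleanup hook, as a sketch only: it assumes JPPF's node life cycle listener extension point (org.jppf.node.event.NodeLifeCycleListener, registered as a service provider on the node) and the NodeRunner.removePersistentData() method; please check the extensions documentation for the exact API before relying on it:

package de.itout.jppf.test.jppfdbm;

import com.zaxxer.hikari.HikariDataSource;
import org.jppf.node.NodeRunner;
import org.jppf.node.event.NodeLifeCycleEvent;
import org.jppf.node.event.NodeLifeCycleListenerAdapter;

// closes the cached connection pool when the node terminates;
// registered via a META-INF/services/org.jppf.node.event.NodeLifeCycleListener file
public class DataSourceCleanup extends NodeLifeCycleListenerAdapter {

    @Override
    public void nodeEnding(final NodeLifeCycleEvent event) {
        Object cached = NodeRunner.getPersistentData("ds");
        if (cached instanceof HikariDataSource) {
            ((HikariDataSource) cached).close();
            NodeRunner.removePersistentData("ds");
        }
    }
}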