Kettle Local Engine Usage Notes (1) (based on the source code of data-integration, a Kettle-based web data integration open-source tool)


📚Chapter 1: Preface

In earlier posts I went through data-integration at a basic level, from deployment to usage. Today I try to pull out the back-end job-execution code and actually run it. Only hands-on work gives you a deeper understanding: features that look simple on paper throw up plenty of problems once you try them, and your view of them changes along the way. That is also why developers so often resent their product colleagues; things sound easy when spoken out loud, but go build them yourself and you will see!

  • Kettle-based web data integration open-source tool (data-integration): Deployment
  • Kettle-based web data integration open-source tool (data-integration): Introduction
  • Kettle-based web data integration open-source tool (data-integration): Usage

data-integration can be understood as a web version of Kettle; see the articles above for details. Here we start directly at the code level.

📚Chapter 2: Demo Source Code

This is built directly on top of my existing personal project 韧小钊 (a purely personal project, organized and updated entirely on a whim; if it helps you, great, otherwise just pass). The idea is to implement job execution on the back end. Saving jobs is already handled by the front end sending the job to the back end, which simply stores a JSON string, so the back-end changes should be small. Here I first extract the back-end job-execution code and run it directly with the payload the front end would send, to see whether it succeeds.

📗pom.xml: Importing the Kettle Engine Core Dependencies

After adding these dependencies, the project failed to start with a JAR conflict; add the exclusion configuration below to exclude the conflicting servlet-api artifact.

<!-- Kettle core dependencies 20240106 by rxz-->
<dependency>
	<groupId>pentaho-kettle</groupId>
	<artifactId>kettle-engine</artifactId>
	<version>${kettle.version}</version>
	<exclusions>
		<exclusion>
			<artifactId>servlet-api</artifactId>
			<groupId>javax.servlet</groupId>
		</exclusion>
	</exclusions>
</dependency>
<dependency>
	<groupId>pentaho-kettle</groupId>
	<artifactId>kettle-core</artifactId>
	<version>${kettle.version}</version>
	<exclusions>
		<exclusion>
			<artifactId>servlet-api</artifactId>
			<groupId>javax.servlet</groupId>
		</exclusion>
	</exclusions>
</dependency>
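
The ${kettle.version} property referenced above still has to be defined somewhere in the pom. A minimal sketch, assuming it lives in the root pom's <properties> block and using the version that is shown later in this post:

<properties>
	<!-- Kettle/PDI version shared by kettle-engine and kettle-core -->
	<kettle.version>9.2.0.0-290</kettle.version>
</properties>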

📗Java Source Code

📕 controller

package com.renxiaozhao.api.controller;

import com.renxiaozhao.common.Result;
import com.renxiaozhao.service.inf.PdiUseDemoService;
import io.swagger.annotations.Api;
import io.swagger.annotations.ApiOperation;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

/**
 * Kettle/PDI usage demo - controller layer
 *
 * @author 韧小钊
 */
@RestController
@RequestMapping("/kettle")
@Api(tags = "KETTLE-PDI")
@Slf4j
public class PdiUseDemoController {
    /**
     * Service instance
     */
    @Autowired
    private PdiUseDemoService pdiUseDemoService;
    
    @PostMapping("/run")
    @ApiOperation(value = "kettle作业运行", notes = "kettle作业运行")
    public Result<Void> executeJob(@RequestParam String jobJson) throws Exception {
        pdiUseDemoService.executeJob(jobJson);
        return Result.success();
    }

}


📕 service

This corresponds to ProjectExecutorController.executeById from the back-end service flow chapter of the Usage article for the Kettle-based web data integration tool (data-integration). Since the goal here is to study how the Kettle engine is actually used, the various layers of wrapping are stripped away: the front end passes the job JSON string directly and the back end executes it as-is.

package com.renxiaozhao.service.inf;

import com.baomidou.mybatisplus.extension.service.IService;
import com.renxiaozhao.bean.entity.SportEntity;

public interface PdiUseDemoService extends IService<SportEntity> {
    // Copied straight from the original code; some entity types are not used yet, please ignore them for now
    void executeJob(String jobJson) throws Exception;
}

package com.renxiaozhao.service.impl;

import com.baomidou.mybatisplus.extension.service.impl.ServiceImpl;
import com.renxiaozhao.bean.entity.SportEntity;
import com.renxiaozhao.dao.mapper.SportMapper;
import com.renxiaozhao.service.inf.PdiUseDemoService;
import com.renxiaozhao.service.util.JSONLinkedObject;
import com.renxiaozhao.service.util.XML;
import lombok.extern.slf4j.Slf4j;
import org.apache.commons.io.FileUtils;
import org.apache.commons.lang.StringEscapeUtils;
import org.pentaho.di.core.exception.KettleMissingPluginsException;
import org.pentaho.di.core.exception.KettleXMLException;
import org.pentaho.di.core.logging.LogLevel;
import org.pentaho.di.core.logging.LoggingObjectType;
import org.pentaho.di.core.logging.SimpleLoggingObject;
import org.pentaho.di.core.variables.Variables;
import org.pentaho.di.core.xml.XMLHandler;
import org.pentaho.di.trans.Trans;
import org.pentaho.di.trans.TransAdapter;
import org.pentaho.di.trans.TransExecutionConfiguration;
import org.pentaho.di.trans.TransMeta;
import org.springframework.stereotype.Service;
import org.w3c.dom.Document;

import java.io.File;
import java.io.IOException;
import java.util.HashMap;
import java.util.UUID;

@Slf4j
@Service
public class PdiUseDemoServiceImpl extends ServiceImpl<SportMapper,SportEntity> implements PdiUseDemoService {

    @Override
    public void executeJob(String jobJson) throws Exception {
        execute(jobJson);
    }
    private void execute(String jobJson) throws Exception {
        // Build the TransMeta object
        TransMeta transMeta = buildTransMeta(jobJson);
        TransExecutionConfiguration executionConfiguration = new TransExecutionConfiguration();
        // Set defaults so that the run configuration is applied correctly
        executionConfiguration.setExecutingLocally(true);
        executionConfiguration.setExecutingRemotely(false);
        executionConfiguration.setExecutingClustered(false);
        // Safe mode is enabled here; set to false if you do not want safe-mode row checks
        executionConfiguration.setSafeModeEnabled(true);
        executionConfiguration.getUsedVariables(transMeta);
        executionConfiguration.setLogLevel(LogLevel.DEBUG);
        // Default to the local engine run configuration
        executionConfiguration.setRunConfiguration("Pentaho local");
        // Set command-line parameters / variables
        executionConfiguration.setVariables(new HashMap<>());
        // Create the Trans
        Trans trans = new Trans(transMeta);
        String spoonLogObjectId = UUID.randomUUID().toString();
        SimpleLoggingObject spoonLoggingObject = new SimpleLoggingObject(Thread.currentThread().getName() + "-" + Thread.currentThread().getId()
                , LoggingObjectType.SPOON, null);
        spoonLoggingObject.setContainerObjectId(spoonLogObjectId);
        spoonLoggingObject.setLogLevel(executionConfiguration.getLogLevel());
        trans.setParent(spoonLoggingObject);
        trans.setLogLevel(executionConfiguration.getLogLevel());
        trans.setReplayDate(executionConfiguration.getReplayDate());
        trans.setRepository(executionConfiguration.getRepository());
        trans.setMonitored(false);
        // Start the transformation
        trans.addTransListener(new TransAdapter() {
            @Override
            public void transFinished(Trans trans) {
                log.info("项目执行完成");
            }
        });
        trans.startThreads();

    }
    public TransMeta buildTransMeta(String jobJson) throws IOException, KettleXMLException, KettleMissingPluginsException {
        Document document;
        // Convert JSON to XML if the payload is not already XML
        if (!jobJson.startsWith("<?xml")) {
            // Unescape, convert to XML, and prepend the XML declaration
            jobJson = StringEscapeUtils.unescapeXml(jobJson);
            jobJson = "<?xml version=\"1.0\" encoding=\"UTF-8\"?>" + XML.toString(new JSONLinkedObject(jobJson));

            log.info("json转换成xml,转换后的xml:{}", jobJson);
        }

        // Write the generated XML to a temporary directory
        File outFile = new File("D:\\tmp\\test", "test.xml");
        FileUtils.writeStringToFile(outFile, jobJson);

        // Load the XML into a DOM Document
        document = XMLHandler.loadXMLString(jobJson);

        TransMeta transMeta = new TransMeta();
        transMeta.loadXML(
                document.getDocumentElement(), outFile.getPath(), null, null, true, new Variables(),
                (message, rememberText, rememberPropertyName) -> {
                    // Yes means: overwrite
                    return true;
                });

        if (transMeta.hasMissingPlugins()) {
            log.info("【{}】缺少执行插件。", jobJson);
        }

        return transMeta;
    }
}
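
Note that startThreads() returns immediately, so the HTTP call comes back before the transformation has actually finished; the only completion signal is the TransAdapter log line. If you want execute() to block until the run is done and to surface step errors, a minimal sketch (my own addition, not part of the original project code) would be to replace the final trans.startThreads() with:

        // Start the step threads, block until they all finish, then check for errors
        trans.startThreads();
        trans.waitUntilFinished();
        if (trans.getErrors() > 0) {
            throw new RuntimeException("Transformation finished with " + trans.getErrors() + " error(s)");
        }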

📕 Miscellaneous

The code has been uploaded.

📕 maven settings.xml

<?xml version="1.0" encoding="UTF-8"?>
<settings xmlns="http://maven.apache.org/SETTINGS/1.2.0"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://maven.apache.org/SETTINGS/1.2.0 https://maven.apache.org/xsd/settings-1.2.0.xsd">

    <localRepository>D:\soft\apache-maven-3.8.5\apache-maven-3.8.5\repository</localRepository>
    <pluginGroups></pluginGroups>
    <proxies></proxies>
    <servers></servers>
    <mirrors>
        <mirror>
            <id>pentaho-public</id>
            <name>Pentaho Public Mirror</name>
            <url>https://repo.orl.eng.hitachivantara.com/artifactory/pnt-mvn/</url>
            <mirrorOf>*</mirrorOf>
        </mirror>
        <mirror>
            <id>aliyunmaven</id>
            <mirrorOf>*</mirrorOf>
            <name>阿里云公共仓库</name>
            <url>https://maven.aliyun.com/repository/public</url>
        </mirror>
    </mirrors>
    <profiles></profiles>

</settings>

📗Testing

Start the service and test it via Swagger.
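Besides Swagger, the endpoint can be exercised directly over HTTP. Below is a minimal client sketch using Java 11's HttpClient; it assumes the service listens on localhost:9080 with context path /renxiaozhao (as the startup and error logs later in this post suggest) and that a hypothetical local file named jobJson.json holds the test payload from the next section - adjust both to your environment. The project itself appears to run on Java 8 judging by the stack traces, so treat this as a standalone client run on JDK 11+.

import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;

public class RunJobClient {
    public static void main(String[] args) throws Exception {
        // Read the transformation JSON (hypothetical local file holding the payload below)
        String jobJson = Files.readString(Paths.get("jobJson.json"), StandardCharsets.UTF_8);
        // The controller declares jobJson as a @RequestParam, so send it form-urlencoded
        String body = "jobJson=" + URLEncoder.encode(jobJson, StandardCharsets.UTF_8);
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:9080/renxiaozhao/kettle/run"))
                .header("Content-Type", "application/x-www-form-urlencoded")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}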

📕 Test Payload

The request parameter corresponds to the project_file column of the dp_portal_project_file table; see the Usage article for the Kettle-based web data integration tool (data-integration) for details.
A reference test payload follows (if you use it directly, remember to change the database connection details):

{
    "transformation": {
        "attributes": "",
        "connection": [
            {
                "access": "Native",
                "attributes": {
                    "attribute": [
                        {
                            "attribute": "Y",
                            "code": "SUPPORTS_BOOLEAN_DATA_TYPE"
                        },
                        {
                            "attribute": "Y",
                            "code": "SUPPORTS_TIMESTAMP_DATA_TYPE"
                        },
                        {
                            "attribute": "N",
                            "code": "QUOTE_ALL_FIELDS"
                        },
                        {
                            "attribute": "N",
                            "code": "FORCE_IDENTIFIERS_TO_LOWERCASE"
                        },
                        {
                            "attribute": "N",
                            "code": "FORCE_IDENTIFIERS_TO_UPPERCASE"
                        },
                        {
                            "attribute": "Y",
                            "code": "PRESERVE_RESERVED_WORD_CASE"
                        },
                        {
                            "attribute": "N",
                            "code": "STRICT_NUMBER_38_INTERPRETATION"
                        },
                        {
                            "attribute": "",
                            "code": "PREFERRED_SCHEMA_NAME"
                        }
                    ]
                },
                "dataSourceId": "e2be98ed-9077-4802-8795-df45acb7b033",
                "dataSourceName": "dataintegration_db",
                "data_tablespace": "",
                "database": "dataintegration_db",
                "dsName": "dataintegration_db",
                "index_tablespace": "",
                "name": "e2be98ed-9077-4802-8795-df45acb7b033",
                "password": "Encrypted 2be98afc86ae49d94a40dab508cc2fd89",
                "port": "3306",
                "server": "192.168.17.10",
                "servername": "",
                "type": "MYSQL",
                "username": "stelladp"
            }
        ],
        "info": {
            "capture_step_performance": "N",
            "clusterschemas": "",
            "created_date": "2020/02/05 13:34:55.102",
            "created_user": "037de45a-e2ad-4d0a-943c-455db05161f4",
            "dependencies": "",
            "description": "",
            "directory": "/home/admin",
            "extended_description": "",
            "feedback_shown": "Y",
            "feedback_size": 50000,
            "is_key_private": "N",
            "key_for_session_key": "",
            "log": "",
            "maxdate": {
                "connection": "",
                "field": "",
                "maxdiff": 0,
                "offset": 0,
                "table": ""
            },
            "modified_date": "2020/02/19 17:08:25.150",
            "modified_user": "admin",
            "name": "执行SQL",
            "parameters": {
                "parameter": [
                ]
            },
            "partitionschemas": "",
            "shared_objects_file": "",
            "size_rowset": 10000,
            "slaveservers": "",
            "sleep_time_empty": 50,
            "sleep_time_full": 50,
            "step_performance_capturing_delay": 1000,
            "step_performance_capturing_size_limit": 100,
            "trans_type": "Normal",
            "trans_version": "",
            "unique_connections": "N",
            "using_thread_priorities": "Y"
        },
        "nodeList": {
            "flowInfo": {
                "Id": "fdf50d66dc7d",
                "Name": "我的流程",
                "Remark": ""
            },
            "lineList": [
            ],
            "nodeList": [
                {
                    "copy": 0,
                    "dataQuerySql": "",
                    "dataSourceId": "",
                    "datasourceId": "",
                    "errors": 0,
                    "icon": "iconfont icon-kongjian1",
                    "id": "ExecSQL6925c0883a9b",
                    "label": "执行SQL脚本",
                    "left": "341px",
                    "linesInput": 0,
                    "linesOutput": 0,
                    "linesRead": 0,
                    "linesRejected": 0,
                    "linesUpdated": 0,
                    "linesWritten": 0,
                    "pluginId": "ExecSQL",
                    "pluginOutput": "Y",
                    "priority": "-",
                    "projectError": 0,
                    "projectId": "",
                    "projectName": "",
                    "seconPluginType": "scripts",
                    "seconds": 0,
                    "speed": 0,
                    "status": false,
                    "statusDescription": "",
                    "stepExecutionStatu": "",
                    "stepExecutionStatus": "",
                    "stepMetaExecutionStatus": null,
                    "stepName": "",
                    "stepSql": "",
                    "top": "212px"
                }
            ],
            "statusListen": [
            ]
        },
        "notepads": "",
        "order": {
            "hop": [
            ]
        },
        "slave-step-copy-partition-distribution": "",
        "slave_transformation": "N",
        "step": [
            {
                "BindString": "",
                "Connectpool": false,
                "Excute_sql": "",
                "GUI": {
                    "draw": "Y",
                    "xloc": "160",
                    "yloc": "224"
                },
                "arguments": {
                    "argument": [
                    ]
                },
                "attributes": [
                ],
                "bool": true,
                "cluster": false,
                "cluster_schema": [
                ],
                "connecName": "",
                "connect_size": "",
                "connection": "e2be98ed-9077-4802-8795-df45acb7b033",
                "copies": "1",
                "custom_distribution": [
                ],
                "defalut_mode": "",
                "delete_field": "",
                "descripe": "The default catalog of connections created by this pool.",
                "description": [
                ],
                "distribute": "Y",
                "execute_each_row": "N",
                "fields": {
                    "field": [
                    ]
                },
                "free_space": "",
                "header": "N",
                "identifier": false,
                "identifier_captial": false,
                "identifier_lowercase": false,
                "initFlag": true,
                "insert_field": "",
                "name": "执行SQL脚本",
                "noempty": "N",
                "oldStepName": "执行SQL脚本",
                "outFields": [
                ],
                "partitioning": {
                    "method": "none",
                    "schema_name": [
                    ]
                },
                "preserve_case": true,
                "quoteString": "N",
                "read_field": "",
                "remotesteps": {
                    "input": "\n      ",
                    "output": "\n      "
                },
                "replace_variables": "N",
                "set_params": false,
                "single_statement": "Y",
                "sql": "select count(1) from dp_portal_role",
                "stoponempty": "N",
                "strict_number": false,
                "time_stamp": true,
                "type": "ExecSQL",
                "update_field": ""
            }
        ],
        "step_error_handling": {
            "error": [
            ]
        }
    }
}

📕 Test Result

It errored out… 😔 See Issue 2 below for details.

⁉️Issue Log

❓Issue 1: JAR conflict - An attempt was made to call the method javax.servlet.ServletContext.setInitParameter(Ljava/lang/String;Ljava/lang/String;)Z but it does not exist. Its class, javax.servlet.ServletContext, is available from the following locations:

java.lang.BootstrapMethodError: java.lang.NoSuchMethodError: javax.servlet.ServletContext.setInitParameter(Ljava/lang/String;Ljava/lang/String;)Z
	at org.springframework.boot.web.servlet.server.AbstractServletWebServerFactory.lambda$mergeInitializers$0(AbstractServletWebServerFactory.java:253)
	at org.springframework.boot.web.embedded.jetty.ServletContextInitializerConfiguration.callInitializers(ServletContextInitializerConfiguration.java:66)
	at org.springframework.boot.web.embedded.jetty.ServletContextInitializerConfiguration.configure(ServletContextInitializerConfiguration.java:55)
	at org.eclipse.jetty.webapp.WebAppContext.configure(WebAppContext.java:517)
	at org.eclipse.jetty.webapp.WebAppContext.startContext(WebAppContext.java:1454)
	at org.eclipse.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:852)
	at org.eclipse.jetty.servlet.ServletContextHandler.doStart(ServletContextHandler.java:278)
	at org.eclipse.jetty.webapp.WebAppContext.doStart(WebAppContext.java:545)
	at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
	at org.eclipse.jetty.util.component.ContainerLifeCycle.start(ContainerLifeCycle.java:138)
	at org.eclipse.jetty.server.Server.start(Server.java:415)
	at org.eclipse.jetty.util.component.ContainerLifeCycle.doStart(ContainerLifeCycle.java:108)
	at org.eclipse.jetty.server.handler.AbstractHandler.doStart(AbstractHandler.java:113)
	at org.eclipse.jetty.server.Server.doStart(Server.java:382)
	at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
	at org.springframework.boot.web.embedded.jetty.JettyWebServer.initialize(JettyWebServer.java:108)
	at org.springframework.boot.web.embedded.jetty.JettyWebServer.<init>(JettyWebServer.java:86)
	at org.springframework.boot.web.embedded.jetty.JettyServletWebServerFactory.getJettyWebServer(JettyServletWebServerFactory.java:410)
	at org.springframework.boot.web.embedded.jetty.JettyServletWebServerFactory.getWebServer(JettyServletWebServerFactory.java:153)
	at org.springframework.boot.web.servlet.context.ServletWebServerApplicationContext.createWebServer(ServletWebServerApplicationContext.java:181)
	at org.springframework.boot.web.servlet.context.ServletWebServerApplicationContext.onRefresh(ServletWebServerApplicationContext.java:154)
	at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:543)
	at org.springframework.boot.web.servlet.context.ServletWebServerApplicationContext.refresh(ServletWebServerApplicationContext.java:142)
	at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:775)
	at org.springframework.boot.SpringApplication.refreshContext(SpringApplication.java:397)
	at org.springframework.boot.SpringApplication.run(SpringApplication.java:316)
	at org.springframework.boot.SpringApplication.run(SpringApplication.java:1260)
	at org.springframework.boot.SpringApplication.run(SpringApplication.java:1248)
	at com.renxiaozhao.api.ApiApplicationServer.main(ApiApplicationServer.java:21)
Caused by: java.lang.NoSuchMethodError: javax.servlet.ServletContext.setInitParameter(Ljava/lang/String;Ljava/lang/String;)Z
	at java.lang.invoke.MethodHandleNatives.resolve(Native Method)
	at java.lang.invoke.MemberName$Factory.resolve(MemberName.java:975)
	at java.lang.invoke.MemberName$Factory.resolveOrFail(MemberName.java:1000)
	at java.lang.invoke.MethodHandles$Lookup.resolveOrFail(MethodHandles.java:1394)
	at java.lang.invoke.MethodHandles$Lookup.linkMethodHandleConstant(MethodHandles.java:1750)
	at java.lang.invoke.MethodHandleNatives.linkMethodHandleConstant(MethodHandleNatives.java:477)
	... 29 common frames omitted
2024-01-06 13:40:50.233 [main] INFO  org.eclipse.jetty.server.AbstractConnector -
				Started ServerConnector@697a0948{HTTP/1.1,[http/1.1]}{0.0.0.0:9080}
2024-01-06 13:40:50.242 [main] INFO  org.eclipse.jetty.server.AbstractConnector -
				Stopped ServerConnector@697a0948{HTTP/1.1,[http/1.1]}{0.0.0.0:9080}
2024-01-06 13:40:50.243 [main] INFO  org.eclipse.jetty.server.handler.ContextHandler -
				Stopped o.s.b.w.e.j.JettyEmbeddedWebAppContext@521441d5{application,/renxiaozhao,[file:///C:/Users/lenovo/AppData/Local/Temp/jetty-docbase.1734690716899826200.9080/, jar:file:/D:/soft/apache-maven-3.8.5/apache-maven-3.8.5/repository/io/springfox/springfox-swagger-ui/2.9.2/springfox-swagger-ui-2.9.2.jar!/META-INF/resources, jar:file:/D:/soft/apache-maven-3.8.5/apache-maven-3.8.5/repository/com/github/xiaoymin/knife4j-spring-ui/2.0.8/knife4j-spring-ui-2.0.8.jar!/META-INF/resources],UNAVAILABLE}
2024-01-06 13:40:50.247 [main] WARN  o.s.b.w.s.c.AnnotationConfigServletWebServerApplicationContext -
				Exception encountered during context initialization - cancelling refresh attempt: org.springframework.context.ApplicationContextException: Unable to start web server; nested exception is org.springframework.boot.web.server.WebServerException: Unable to start embedded Jetty web server
2024-01-06 13:40:50.261 [main] INFO  o.s.b.a.l.ConditionEvaluationReportLoggingListener -
				

Error starting ApplicationContext. To display the conditions report re-run your application with 'debug' enabled.
2024-01-06 13:40:50.287 [main] ERROR o.s.b.diagnostics.LoggingFailureAnalysisReporter -
				

***************************
APPLICATION FAILED TO START
***************************

Description:

An attempt was made to call the method javax.servlet.ServletContext.setInitParameter(Ljava/lang/String;Ljava/lang/String;)Z but it does not exist. Its class, javax.servlet.ServletContext, is available from the following locations:

    jar:file:/D:/soft/apache-maven-3.8.5/apache-maven-3.8.5/repository/javax/servlet/servlet-api/2.4/servlet-api-2.4.jar!/javax/servlet/ServletContext.class
    jar:file:/D:/soft/apache-maven-3.8.5/apache-maven-3.8.5/repository/javax/servlet/javax.servlet-api/4.0.1/javax.servlet-api-4.0.1.jar!/javax/servlet/ServletContext.class

It was loaded from the following location:

    file:/D:/soft/apache-maven-3.8.5/apache-maven-3.8.5/repository/javax/servlet/servlet-api/2.4/servlet-api-2.4.jar


Action:

Correct the classpath of your application so that it contains a single, compatible version of javax.servlet.ServletContext

Disconnected from the target VM, address: '127.0.0.1:50771', transport: 'socket'

Process finished with exit code 1

❗Solution 1: Exclude the conflicting JAR from the Kettle dependencies (the project started fine before Kettle was introduced)


<kettle.version>9.2.0.0-290</kettle.version>
<dependency>
	<groupId>pentaho-kettle</groupId>
	<artifactId>kettle-engine</artifactId>
	<version>${kettle.version}</version>
	<exclusions>
		<exclusion>
			<artifactId>servlet-api</artifactId>
			<groupId>javax.servlet</groupId>
		</exclusion>
	</exclusions>
</dependency>

<dependency>
	<groupId>pentaho-kettle</groupId>
	<artifactId>kettle-core</artifactId>
	<version>${kettle.version}</version>
	<exclusions>
		<exclusion>
			<artifactId>servlet-api</artifactId>
			<groupId>javax.servlet</groupId>
		</exclusion>
	</exclusions>
</dependency>


❕Extra: Excluding conflicting JARs with the Maven Helper plugin

In IntelliJ IDEA, install the Maven Helper plugin to investigate conflicting JARs. The steps are as follows:

  • Install the plugin
  • Once installation is complete, open pom.xml and a Dependency Analyzer tab appears
  • Click Dependency Analyzer to run the analysis
  • Try to narrow the search to the place you changed (locate the modified pom.xml)
    Since it is a conflict, either side can be excluded; exclude the JAR indicated by the error first, and try the other one if that does not work
  • Click Exclude and the exclusion configuration is generated for you automatically

❓Issue 2: Database type not found!… database type with plugin id [Oracle] couldn't be found!

2024-01-05 16:54:41.190 [qtp1963906615-73] WARN  org.eclipse.jetty.server.HttpChannel -
				/renxiaozhao/kettle/run
org.springframework.web.util.NestedServletException: Request processing failed; nested exception is org.pentaho.di.core.exception.KettleXMLException: 
错误从XML文件读取转换
Database type not found!

	at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:1013)
	at org.springframework.web.servlet.FrameworkServlet.doPost(FrameworkServlet.java:908)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:665)
	at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:882)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:750)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:867)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1623)
	at org.springframework.web.filter.RequestContextFilter.doFilterInternal(RequestContextFilter.java:99)
	at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1610)
	at org.springframework.web.filter.FormContentFilter.doFilterInternal(FormContentFilter.java:92)
	at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1610)
	at org.springframework.web.filter.HiddenHttpMethodFilter.doFilterInternal(HiddenHttpMethodFilter.java:93)
	at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1610)
	at org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:200)
	at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1610)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:540)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:146)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:548)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:257)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1588)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:255)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1345)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:203)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:480)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1557)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:201)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1247)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:144)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)
	at org.eclipse.jetty.server.Server.handle(Server.java:502)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:364)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:260)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:305)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)
	at org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:118)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:333)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:310)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:168)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:126)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:366)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:765)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:683)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.pentaho.di.core.exception.KettleXMLException: 
错误从XML文件读取转换
Database type not found!

	at org.pentaho.di.trans.TransMeta.loadXML(TransMeta.java:3460)
	at com.renxiaozhao.service.impl.PdiUseDemoServiceImpl.buildTransMeta(PdiUseDemoServiceImpl.java:96)
	at com.renxiaozhao.service.impl.PdiUseDemoServiceImpl.execute(PdiUseDemoServiceImpl.java:41)
	at com.renxiaozhao.service.impl.PdiUseDemoServiceImpl.executeJob(PdiUseDemoServiceImpl.java:37)
	at com.renxiaozhao.service.impl.PdiUseDemoServiceImpl$$FastClassBySpringCGLIB$$56121f70.invoke(<generated>)
	at org.springframework.cglib.proxy.MethodProxy.invoke(MethodProxy.java:218)
	at org.springframework.aop.framework.CglibAopProxy$DynamicAdvisedInterceptor.intercept(CglibAopProxy.java:684)
	at com.renxiaozhao.service.impl.PdiUseDemoServiceImpl$$EnhancerBySpringCGLIB$$b8de3a8b.executeJob(<generated>)
	at com.renxiaozhao.api.controller.PdiUseDemoController.executeJob(PdiUseDemoController.java:33)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:189)
	at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:138)
	at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:102)
	at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:895)
	at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:800)
	at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:87)
	at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:1038)
	at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:942)
	at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:1005)
	... 47 common frames omitted
Caused by: java.lang.RuntimeException: Database type not found!
	at org.pentaho.di.core.database.DatabaseMeta.setValues(DatabaseMeta.java:642)
	at org.pentaho.di.core.database.DatabaseMeta.setDefault(DatabaseMeta.java:525)
	at org.pentaho.di.core.database.DatabaseMeta.<init>(DatabaseMeta.java:516)
	at org.pentaho.di.core.database.DatabaseMeta.<init>(DatabaseMeta.java:986)
	at org.pentaho.di.trans.TransMeta.loadXML(TransMeta.java:3030)
	... 68 common frames omitted
Caused by: org.pentaho.di.core.exception.KettleDatabaseException: 
database type with plugin id [Oracle] couldn't be found!

	at org.pentaho.di.core.database.DatabaseMeta.findDatabaseInterface(DatabaseMeta.java:592)
	at org.pentaho.di.core.database.DatabaseMeta.getDatabaseInterface(DatabaseMeta.java:566)
	at org.pentaho.di.core.database.DatabaseMeta.setValues(DatabaseMeta.java:640)
	... 72 common frames omitted

❗Idea 1: The plugin type does not exist and needs to be created manually

Checking the corresponding configuration file kettle-database-types.xml shows that the entry does exist, so the error is not caused by a missing type definition:

  <database-type id="ORACLE">
    <description>Oracle</description>
    <classname>org.pentaho.di.core.database.OracleDatabaseMeta</classname>
  </database-type>   

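To double-check at runtime whether the database types from kettle-database-types.xml were actually registered, you can dump the ids known to the plugin registry. This is a diagnostic sketch of my own (PluginRegistry and DatabasePluginType are standard Kettle classes; where you call it from is up to you); if the list comes back empty or without ORACLE/MYSQL, the registry was never populated, which points at Idea 3 below.

import org.pentaho.di.core.plugins.DatabasePluginType;
import org.pentaho.di.core.plugins.PluginInterface;
import org.pentaho.di.core.plugins.PluginRegistry;

import java.util.List;

public class DatabasePluginCheck {
    public static void listDatabasePlugins() {
        // Every registered database type (MYSQL, ORACLE, ...) appears as a plugin of DatabasePluginType
        List<PluginInterface> plugins = PluginRegistry.getInstance().getPlugins(DatabasePluginType.class);
        for (PluginInterface plugin : plugins) {
            System.out.println(String.join(",", plugin.getIds()) + " -> " + plugin.getName());
        }
    }
}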

❗Idea 2: Make sure the XML converted from the JSON is correct


❗Idea 3: Load Kettle correctly

My current suspicion is that Kettle itself is not being loaded correctly, but I have no clear idea how to track this down; if anyone knows, please tell me. I will spend another half hour on it and give up if nothing comes of it. It is Saturday, after all, no need to grind too hard!
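
For reference, the usual way to load Kettle in an embedded/standalone application is to call KettleEnvironment.init() once before any TransMeta or Trans is built; it populates the plugin registry, including the database types discussed above. A minimal sketch of what that could look like, offered as an assumption rather than the confirmed fix (the actual resolution is in part two):

import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.core.exception.KettleException;

public class KettleBootstrap {
    // Call once at application startup, before any TransMeta/Trans is created
    public static synchronized void ensureKettleInitialized() throws KettleException {
        if (!KettleEnvironment.isInitialized()) {
            // Registers the core plugin types (steps, job entries, database types, ...)
            KettleEnvironment.init();
        }
    }
}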


Update: the problem has since been solved; see Kettle Local引擎使用记录(二):问题记录及解决.
