
Spring Boot multi-datasource configuration: integrating MySQL and Hive with MyBatis and the Druid connection pool

  • Introduction: In real projects we rarely work with just a single data source; more often one project has to talk to several. In a recent project I had to pull part of the data from a big-data platform, while the project's own business data lived in MySQL, so two data sources were involved.
  • This article shows how to wire up Hive and MySQL together with Spring Boot and MyBatis, which is much more convenient and flexible than managing plain JDBC connections by hand.

Step 1: First, add the required dependencies:

Maven dependencies, including MyBatis, Spring Boot, the big-data (Hadoop/Hive) connectors, the MySQL driver, Druid, and so on:

<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
<java.version>1.8</java.version>
<spring.version>4.3.9.RELEASE</spring.version>
</properties>
 
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>1.5.9.RELEASE</version>
</parent>
 
<dependencies>
 
<dependency>
	<groupId>org.springframework.boot</groupId>
	<artifactId>spring-boot-starter-web</artifactId>
</dependency>
 
<dependency>
	<groupId>org.springframework</groupId>
	<artifactId>spring-core</artifactId>
</dependency>
<dependency>
	<groupId>org.springframework</groupId>
	<artifactId>spring-context</artifactId>
</dependency>
 
 
<!-- 添加mybatis支持 -->
<dependency>
	<groupId>org.mybatis.spring.boot</groupId>
	<artifactId>mybatis-spring-boot-starter</artifactId>
	<version>1.3.2</version>
</dependency>
 
<!-- 添加mysql驱动 -->
<dependency>
	<groupId>mysql</groupId>
	<artifactId>mysql-connector-java</artifactId>
</dependency>
 
<!-- 添加数据库连接池 -->
<dependency>
	<groupId>com.alibaba</groupId>
	<artifactId>druid</artifactId>
	<version>1.0.29</version>
</dependency>
 
<!-- 添加spring管理bean对象 -->
<dependency>
	<groupId>org.springframework</groupId>
	<artifactId>spring-beans</artifactId>
</dependency>
 
<!-- 添加hadoop依赖 -->
<dependency>
	<groupId>org.apache.hadoop</groupId>
	<artifactId>hadoop-common</artifactId>
	<version>2.6.0</version>
</dependency>
 
<dependency>
	<groupId>org.apache.hadoop</groupId>
	<artifactId>hadoop-mapreduce-client-core</artifactId>
	<version>2.6.0</version>
</dependency>
 
<dependency>
	<groupId>org.apache.hadoop</groupId>
	<artifactId>hadoop-mapreduce-client-common</artifactId>
	<version>2.6.0</version>
</dependency>
 
<dependency>
	<groupId>org.apache.hadoop</groupId>
	<artifactId>hadoop-hdfs</artifactId>
	<version>2.6.0</version>
</dependency>
 
<dependency>
	<groupId>jdk.tools</groupId>
	<artifactId>jdk.tools</artifactId>
	<version>1.8</version>
	<scope>system</scope>
	<systemPath>${JAVA_HOME}/lib/tools.jar</systemPath>
</dependency>
 
<dependency>
	<groupId>org.springframework.boot</groupId>
	<artifactId>spring-boot-configuration-processor</artifactId>
	<optional>true</optional>
</dependency>
 
<!-- 添加hive依赖 -->
<dependency>
	<groupId>org.apache.hive</groupId>
	<artifactId>hive-jdbc</artifactId>
	<version>1.1.0</version>
	<exclusions>
		<exclusion>
			<groupId>org.eclipse.jetty.aggregate</groupId>
			<artifactId>*</artifactId>
		</exclusion>
	</exclusions>
</dependency>
<dependency>
	<groupId>org.apache.hive</groupId>
	<artifactId>hive-service</artifactId>
	<version>1.1.0</version>
</dependency>
 
</dependencies>	

Step 2: Configure the properties for both data sources. Either a yml or a properties file works; I am using yml here:

spring:
  datasource:
    mysqlMain: # data source 1: MySQL
      type: com.alibaba.druid.pool.DruidDataSource
      jdbc-url: jdbc:mysql://0.0.0.0:3306/heyufu?characterEncoding=UTF-8&useUnicode=true&serverTimezone=GMT%2B8
      username: root
      password: root
      driver-class-name: com.mysql.cj.jdbc.Driver
    hive: # data source 2: Hive
      jdbc-url: jdbc:hive2://0.0.0.0:10000/iot
      username: hive
      password: hive
      driver-class-name: org.apache.hive.jdbc.HiveDriver
      type: com.alibaba.druid.pool.DruidDataSource
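
If you would rather use a properties file, the same settings translate one-to-one from the YAML above (host, database name and credentials are placeholders, exactly as in the YAML):

# data source 1: MySQL
spring.datasource.mysqlMain.type=com.alibaba.druid.pool.DruidDataSource
spring.datasource.mysqlMain.jdbc-url=jdbc:mysql://0.0.0.0:3306/heyufu?characterEncoding=UTF-8&useUnicode=true&serverTimezone=GMT%2B8
spring.datasource.mysqlMain.username=root
spring.datasource.mysqlMain.password=root
spring.datasource.mysqlMain.driver-class-name=com.mysql.cj.jdbc.Driver

# data source 2: Hive
spring.datasource.hive.jdbc-url=jdbc:hive2://0.0.0.0:10000/iot
spring.datasource.hive.username=hive
spring.datasource.hive.password=hive
spring.datasource.hive.driver-class-name=org.apache.hive.jdbc.HiveDriver
spring.datasource.hive.type=com.alibaba.druid.pool.DruidDataSource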
    

Step 3: Create a shared DataSourceProperties class

It reads the Hive and MySQL data source settings from the configuration above.

import lombok.Data;
import org.springframework.boot.context.properties.ConfigurationProperties;

import java.util.Map;


@Data
@ConfigurationProperties(prefix = DataSourceProperties.DS, ignoreUnknownFields = false)
public class DataSourceProperties {

    final static String DS = "spring.datasource";

    private Map<String,String> mysqlMain;

    private Map<String,String> hive;

    

}
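
If you want to double-check that the two maps were bound correctly, here is a minimal sketch (the class name DataSourceBindingCheck is my own, not part of the original setup) that prints both maps at startup. Note that map keys are typically kept exactly as spelled in the configuration file (e.g. jdbc-url, driver-class-name), which is why the configuration classes below look them up with those keys:

import org.springframework.boot.CommandLineRunner;
import org.springframework.boot.context.properties.EnableConfigurationProperties;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
@EnableConfigurationProperties(DataSourceProperties.class)
public class DataSourceBindingCheck {

    // prints the bound key/value pairs of both data sources when the application starts
    @Bean
    public CommandLineRunner printDataSourceProperties(DataSourceProperties props) {
        return args -> {
            System.out.println("mysqlMain -> " + props.getMysqlMain());
            System.out.println("hive      -> " + props.getHive());
        };
    }
}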

Step 4: Create the configuration class for each data source

MySQL configuration class

import com.alibaba.druid.pool.DruidDataSource;
import com.xxxx.xxxx.Config.DataSourceProperties;
import lombok.extern.log4j.Log4j2;
import org.apache.ibatis.session.SqlSessionFactory;
import org.mybatis.spring.SqlSessionFactoryBean;
import org.mybatis.spring.SqlSessionTemplate;
import org.mybatis.spring.annotation.MapperScan;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.boot.context.properties.EnableConfigurationProperties;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;
import org.springframework.core.io.support.PathMatchingResourcePatternResolver;

import javax.sql.DataSource;
import java.sql.SQLException;


@Configuration
//scans the db1 DAO package and binds its mappers to this SqlSessionFactory
@MapperScan(basePackages = "com.xxxx.xxxx.Dao.db1", sqlSessionFactoryRef = "db1SqlSessionFactory")
@Log4j2
@EnableConfigurationProperties({DataSourceProperties.class})
public class MysqlConfig {

    @Autowired
    private DataSourceProperties dataSourceProperties;

  
	// mark this as the primary data source
    @Primary
    @Bean("db1DataSource")
    public DataSource getDb1DataSource() throws Exception {
        DruidDataSource datasource = new DruidDataSource();
        // set the data source properties from the mysqlMain map
        datasource.setUrl(dataSourceProperties.getMysqlMain().get("jdbc-url"));
        datasource.setUsername(dataSourceProperties.getMysqlMain().get("username"));
        datasource.setPassword(dataSourceProperties.getMysqlMain().get("password"));
        datasource.setDriverClassName(dataSourceProperties.getMysqlMain().get("driver-class-name"));

        return datasource;
    }
	
	// create the SqlSessionFactory bean
    @Primary
    // the bean name must be unique among all beans in the project
    @Bean("db1SqlSessionFactory")
    public SqlSessionFactory db1SqlSessionFactory(@Qualifier("db1DataSource") DataSource dataSource) throws Exception {
        SqlSessionFactoryBean bean = new SqlSessionFactoryBean();
        bean.setDataSource(dataSource);
        // The mapper XML locations must be configured, otherwise you will get a 'no statement' error (this can also happen when the namespace in a mapper XML does not match the mapper interface's package path)
        bean.setMapperLocations(new PathMatchingResourcePatternResolver().getResources("classpath*:Mapper/db1/*.xml"));
        return bean.getObject();
    }

	// create the SqlSessionTemplate bean
    @Primary
    @Bean("db1SqlSessionTemplate")
    public SqlSessionTemplate db1SqlSessionTemplate(@Qualifier("db1SqlSessionFactory") SqlSessionFactory sqlSessionFactory){
        return new SqlSessionTemplate(sqlSessionFactory);
    }

}

Hive configuration class

import com.alibaba.druid.pool.DruidDataSource;
import com.xxxx.xxxx.Config.DataSourceProperties;
import lombok.extern.log4j.Log4j2;
import org.apache.ibatis.session.SqlSessionFactory;
import org.mybatis.spring.SqlSessionFactoryBean;
import org.mybatis.spring.SqlSessionTemplate;
import org.mybatis.spring.annotation.MapperScan;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.boot.context.properties.EnableConfigurationProperties;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.support.PathMatchingResourcePatternResolver;

import javax.sql.DataSource;
import java.sql.SQLException;


@Configuration
@MapperScan(basePackages = "com.xxxx.xxxx.Dao.db2", sqlSessionFactoryRef = "db2SqlSessionFactory")
@Log4j2
@EnableConfigurationProperties({DataSourceProperties.class})
public class HiveConfig {

    @Autowired
    private DataSourceProperties dataSourceProperties;

    @Bean("db2DataSource")
    public DataSource getDb2DataSource(){
        DruidDataSource datasource = new DruidDataSource();

        // set the data source properties from the hive map
        datasource.setUrl(dataSourceProperties.getHive().get("jdbc-url"));
        datasource.setUsername(dataSourceProperties.getHive().get("username"));
        datasource.setPassword(dataSourceProperties.getHive().get("password"));
        datasource.setDriverClassName(dataSourceProperties.getHive().get("driver-class-name"));

        return datasource;
    }

    @Bean("db2SqlSessionFactory")
    public SqlSessionFactory db2SqlSessionFactory(@Qualifier("db2DataSource") DataSource dataSource) throws Exception {
        SqlSessionFactoryBean bean = new SqlSessionFactoryBean();
        bean.setDataSource(dataSource);
        // The mapper XML locations must be configured, otherwise you will get a 'no statement' error (this can also happen when the namespace does not match the mapper interface's package path)
        // set the mapper.xml path; the classpath pattern must not contain spaces
        bean.setMapperLocations(new PathMatchingResourcePatternResolver().getResources("classpath*:Mapper/db2/*.xml"));
        return bean.getObject();
    }

    @Bean("db2SqlSessionTemplate")
    public SqlSessionTemplate db2SqlSessionTemplate(@Qualifier("db2SqlSessionFactory") SqlSessionFactory sqlSessionFactory){
        return new SqlSessionTemplate(sqlSessionFactory);
    }

}

Note: @Primary marks the primary data source. A project may have only one primary data source, otherwise startup will fail.

Create the Mapper interfaces. These are the files picked up by the packages configured in @MapperScan:


// lives in com.xxxx.xxxx.Dao.db2, so it is scanned by HiveConfig
@Mapper
public interface HiveMapper {

    String selectList();
}

// lives in com.xxxx.xxxx.Dao.db1, so it is scanned by MysqlConfig
@Mapper
public interface MysqlMapper {

    String selectList();
}

The mapper.xml files look like this:


HiveMapper.xml (placed under Mapper/db2/):

<mapper namespace="com.xxxx.xxxx.Dao.db2.HiveMapper">
    <select id="selectList" resultType="string">
        select count(*) from table_name
    </select>
</mapper>

MysqlMapper.xml (placed under Mapper/db1/):

<mapper namespace="com.xxxx.xxxx.Dao.db1.MysqlMapper">
    <select id="selectList" resultType="string">
        select count(*) from table_name
    </select>
</mapper>

 

Create the Service layer:

public interface HiveService{
    String selectList();
}
public interface MysqlService{
    String selectList();
}

ServiceImpl implementation classes:

@Service
public class HiveServiceImpl implements HiveService {
    @Resource
    HiveMapper hiveMapper;

    @Override
    public String selectList() {
        return hiveMapper.selectList();
    }
}
@Service
public class MysqlServiceImpl implements MysqlService {
    @Resource
    MysqlMapper mysqlMapper;

    @Override
    public String selectList() {
        return mysqlMapper.selectList();
    }
}

Test class:

@Resource
HiveService hiveService;
@Resource
MysqlService mysqlService;

@Test
public void testHive() {
    String a = hiveService.selectList();
    System.out.println(a);
}

@Test
public void testMysql() {
    String b = mysqlService.selectList();
    System.out.println(b);
}
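
As a final usage sketch (not part of the original article; the controller name and request paths are hypothetical), the two services can be exposed side by side from a single REST controller, each query going through its own data source:

import javax.annotation.Resource;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class CountController {

    @Resource
    private HiveService hiveService;

    @Resource
    private MysqlService mysqlService;

    // routed through the Hive data source (db2)
    @GetMapping("/count/hive")
    public String hiveCount() {
        return hiveService.selectList();
    }

    // routed through the MySQL data source (db1)
    @GetMapping("/count/mysql")
    public String mysqlCount() {
        return mysqlService.selectList();
    }
}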

And that's it, everything is wired up and working!