This article provides an in-depth introduction to HBase fundamentals, environment setup and configuration, and data operation and management practices, along with a hands-on HBase project covering data modeling, data operations, and queries. It also covers performance-tuning tips and solutions to common problems, aiming to give readers a complete, practical grasp of HBase.
HBase is a distributed, open-source, scalable database built on top of the Hadoop storage and compute stack, used mainly to store unstructured and semi-structured data. It is designed for fast random read/write access to very large data sets and is typically deployed alongside Hadoop to handle petabyte-scale data. HBase began as a subproject of Apache Hadoop, modeled on Google's Bigtable design, and later became an Apache top-level project.
HBase uses Hadoop's HDFS as its underlying storage, which means it inherits HDFS's distributed storage capabilities and supports large-scale data storage with highly concurrent access. Its architecture lets a cluster scale out both storage capacity and processing power, meeting the demands of big-data applications.
Feature | MySQL/PostgreSQL | HBase |
---|---|---|
Storage model | Relational database | NoSQL distributed storage system |
Data model | Tables, rows, columns | Tables, column families, columns, rows |
Consistency | Strong consistency, multi-row ACID transactions | Strong consistency per row; no multi-row transactions |
Storage capacity | Limited by a single server | Horizontally scalable to petabyte scale |
Access pattern | Rich SQL queries (joins, secondary indexes) | Fast random reads/writes by row key |
Typical workload | Small to mid-sized data sets | Big-data applications |
Versioning | None built in | Multiple versions per cell |
Schema rigidity | High (fixed schema) | Low (flexible, sparse columns) |
Write throughput | Moderate | High |
Set the `HBASE_HOME` environment variable to point to the extracted HBase directory and add its `bin` directory to your `PATH`:

```bash
export HBASE_HOME=/path/to/hbase
export PATH=$PATH:$HBASE_HOME/bin
```

Then edit the `conf/hbase-site.xml` file for basic configuration. Key properties include:

- `hbase.rootdir`: the directory where HBase stores its data and metadata. A `file://` URI is only suitable for standalone testing; a distributed deployment should point at HDFS (for example `hdfs://namenode:8020/hbase`).
- `hbase.zookeeper.quorum`: the ZooKeeper addresses used to coordinate the HBase cluster.
- `hbase.cluster.distributed`: set to `true` for a distributed deployment.

```xml
<configuration>
  <property>
    <name>hbase.rootdir</name>
    <value>file:///path/to/hbase/data</value>
  </property>
  <property>
    <name>hbase.zookeeper.quorum</name>
    <value>localhost</value>
  </property>
  <property>
    <name>hbase.cluster.distributed</name>
    <value>true</value>
  </property>
</configuration>
```

Note that this sample keeps the `file://` rootdir for simplicity; in a real cluster, pair `hbase.cluster.distributed=true` with an HDFS rootdir.
Start ZooKeeper first (HBase relies on it for coordination), then start HBase:

```bash
$ ./zkServer.sh start
$ hbase-daemon.sh start master
```

To shut down, stop HBase before stopping ZooKeeper:

```bash
$ hbase-daemon.sh stop master
$ ./zkServer.sh stop
```
To verify that HBase started successfully, list the running Java processes with the `jps` command:

```bash
$ jps
```
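On a healthy node the HBase daemons should appear among the listed JVMs. The output below is illustrative only (PIDs will differ, and whether you see `HQuorumPeer` or an external `QuorumPeerMain` depends on how ZooKeeper is run):

```
12345 HMaster
12389 HRegionServer
12411 HQuorumPeer
12500 Jps
```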
A table is the primary data structure in HBase, analogous to a table in a relational database. Each table consists of rows; each row is uniquely identified by a row key and can contain multiple column families and columns.

A column family is a group of columns and the unit of physical storage: each column family's data is kept in its own set of files. Column families are declared when the table is created, while the columns within a family can vary freely from row to row.

A column is a specific column within a column family, and its full name is prefixed with the family name (for example `cf1:col1`). Columns are the actual storage units for data; their names and values can be created dynamically at runtime.

A row is a data row in the table, identified by its row key. A row key is an arbitrary sequence of bytes, often rendered as a string. HBase imposes no fixed row schema, so columns can be added to a row dynamically.
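A short HBase shell sketch makes this model concrete (the `users` table and `info` family here are illustrative, not part of the running example): a column under a family springs into existence simply by writing to it, with no schema change.

```
hbase(main):001:0> create 'users', 'info'
hbase(main):002:0> put 'users', 'u1', 'info:name', 'Alice'
hbase(main):003:0> put 'users', 'u1', 'info:email', 'alice@example.com'
hbase(main):004:0> scan 'users'
```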
To insert data into an HBase table, specify the row key, the column family and column qualifier, and the corresponding value.
```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.*;
import org.apache.hadoop.hbase.util.Bytes;

import java.io.IOException;

public class HBaseExample {
    public static void main(String[] args) {
        Configuration config = HBaseConfiguration.create();
        // try-with-resources closes the table and connection automatically
        try (Connection connection = ConnectionFactory.createConnection(config);
             Table table = connection.getTable(TableName.valueOf("myTable"))) {
            // Write value1 into column cf1:col1 of row1
            Put put = new Put(Bytes.toBytes("row1"));
            put.addColumn(Bytes.toBytes("cf1"), Bytes.toBytes("col1"), Bytes.toBytes("value1"));
            table.put(put);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
```
Queries typically use a `Get` object to specify the row and the column family and qualifier to read.
```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.*;
import org.apache.hadoop.hbase.util.Bytes;

import java.io.IOException;

public class HBaseExample {
    public static void main(String[] args) {
        Configuration config = HBaseConfiguration.create();
        try (Connection connection = ConnectionFactory.createConnection(config);
             Table table = connection.getTable(TableName.valueOf("myTable"))) {
            // Read column cf1:col1 of row1
            Get get = new Get(Bytes.toBytes("row1"));
            get.addColumn(Bytes.toBytes("cf1"), Bytes.toBytes("col1"));
            Result result = table.get(get);
            byte[] value = result.getValue(Bytes.toBytes("cf1"), Bytes.toBytes("col1")); // null if the cell is absent
            System.out.println("Value: " + Bytes.toString(value));
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
```
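`Get` reads a single row; to read a range of rows you would use a `Scan` instead. A minimal sketch, reusing the `myTable`/`cf1` names from the examples above and assuming an HBase 2.x client (`withStartRow`/`withStopRow` superseded the deprecated `setStartRow`/`setStopRow`):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.*;
import org.apache.hadoop.hbase.util.Bytes;

import java.io.IOException;

public class HBaseScanExample {
    public static void main(String[] args) {
        Configuration config = HBaseConfiguration.create();
        try (Connection connection = ConnectionFactory.createConnection(config);
             Table table = connection.getTable(TableName.valueOf("myTable"))) {
            // Scan rows in [row1, row9), reading only cf1:col1
            Scan scan = new Scan()
                    .withStartRow(Bytes.toBytes("row1"))
                    .withStopRow(Bytes.toBytes("row9"))
                    .addColumn(Bytes.toBytes("cf1"), Bytes.toBytes("col1"));
            try (ResultScanner scanner = table.getScanner(scan)) {
                for (Result result : scanner) {
                    byte[] value = result.getValue(Bytes.toBytes("cf1"), Bytes.toBytes("col1"));
                    System.out.println(Bytes.toString(result.getRow()) + " -> " + Bytes.toString(value));
                }
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
```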
Updating data works like inserting: writing a `Put` to an existing cell stores a new version of it. You can also pass an explicit timestamp to write a particular version (see the versioning example later in this article).
```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.*;
import org.apache.hadoop.hbase.util.Bytes;

import java.io.IOException;

public class HBaseExample {
    public static void main(String[] args) {
        Configuration config = HBaseConfiguration.create();
        try (Connection connection = ConnectionFactory.createConnection(config);
             Table table = connection.getTable(TableName.valueOf("myTable"))) {
            // Writing to an existing cell stores a new version with a fresh timestamp
            Put put = new Put(Bytes.toBytes("row1"));
            put.addColumn(Bytes.toBytes("cf1"), Bytes.toBytes("col1"), Bytes.toBytes("new value"));
            table.put(put);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
```
A delete can remove a specific cell or an entire row.
```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.*;
import org.apache.hadoop.hbase.util.Bytes;

import java.io.IOException;

public class HBaseExample {
    public static void main(String[] args) {
        Configuration config = HBaseConfiguration.create();
        try (Connection connection = ConnectionFactory.createConnection(config);
             Table table = connection.getTable(TableName.valueOf("myTable"))) {
            Delete delete = new Delete(Bytes.toBytes("row1"));
            // Restrict the delete to cell cf1:col1; without addColumn the whole row is deleted
            delete.addColumn(Bytes.toBytes("cf1"), Bytes.toBytes("col1"));
            table.delete(delete);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
```
The same operations are available interactively in the HBase shell:

```
hbase(main):001:0> create 'myTable', 'cf1'
0 row(s) in 1.2930 seconds

hbase(main):002:0> put 'myTable', 'row1', 'cf1:col1', 'value1'
0 row(s) in 0.0350 seconds

hbase(main):003:0> get 'myTable', 'row1'
COLUMN                CELL
 cf1:col1             timestamp=1234567890, value=value1
1 row(s) in 0.0220 seconds
```
When designing an HBase table schema, several key factors need to be considered:
Row key design principles: keep keys short and unique, and distribute writes to avoid hotspots. Purely monotonically increasing keys (sequence numbers, raw timestamps) funnel all new writes into a single region, so such keys are usually salted, hashed, or reversed; a sketch follows below. Example row keys:

```
"row1"   "202301010001"   "20230101-0001"
```
Column family design should follow the data's access patterns and storage efficiency: keep the number of families small and group columns that are read together. Example family names:

```
"cf1"   "cf2"
```
Column names are prefixed with their column family, which keeps them easy to manage and query:

```
"cf1:col1"   "cf1:col2"
```
Cell versioning can be enabled or tuned per column family according to business needs, and a `Put` may carry an explicit timestamp to write a particular version:
```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.*;
import org.apache.hadoop.hbase.util.Bytes;

import java.io.IOException;

public class HBaseExample {
    public static void main(String[] args) {
        Configuration config = HBaseConfiguration.create();
        try (Connection connection = ConnectionFactory.createConnection(config);
             Table table = connection.getTable(TableName.valueOf("myTable"))) {
            Put put = new Put(Bytes.toBytes("row1"));
            // The third argument is an explicit timestamp (version); by default HBase assigns server time
            put.addColumn(Bytes.toBytes("cf1"), Bytes.toBytes("col1"), 1234567890L, Bytes.toBytes("value1"));
            table.put(put);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
```
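To read back more than the latest version, the column family must be created with `VERSIONS` greater than 1, and the `Get` must request multiple versions. A hedged sketch assuming an HBase 2.x client (`readVersions` superseded the deprecated `setMaxVersions`):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.*;
import org.apache.hadoop.hbase.client.*;
import org.apache.hadoop.hbase.util.Bytes;

import java.io.IOException;
import java.util.List;

public class HBaseVersionsExample {
    public static void main(String[] args) throws IOException {
        Configuration config = HBaseConfiguration.create();
        try (Connection connection = ConnectionFactory.createConnection(config);
             Table table = connection.getTable(TableName.valueOf("myTable"))) {
            Get get = new Get(Bytes.toBytes("row1"));
            get.addColumn(Bytes.toBytes("cf1"), Bytes.toBytes("col1"));
            get.readVersions(3); // up to 3 versions, newest first
            Result result = table.get(get);
            List<Cell> cells = result.getColumnCells(Bytes.toBytes("cf1"), Bytes.toBytes("col1"));
            for (Cell cell : cells) {
                System.out.println(cell.getTimestamp() + " -> "
                        + Bytes.toString(CellUtil.cloneValue(cell)));
            }
        }
    }
}
```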
Suppose you are building a product information management system for an e-commerce platform. The system must store a large volume of product data and support efficient query and update operations. Based on these requirements, the data model is designed as follows:
- Table: `product_info`
- Row key: `product_id`
- Column families: `product_details` and `inventory`
- Columns in `product_details`: `name` (product name), `description` (product description), `price` (product price)
- Columns in `inventory`: `quantity` (stock quantity)

The table can be created (and recreated from scratch) through the Admin API:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HColumnDescriptor;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.*;

import java.io.IOException;

public class HBaseExample {
    public static void main(String[] args) {
        Configuration config = HBaseConfiguration.create();
        try (Connection connection = ConnectionFactory.createConnection(config);
             Admin admin = connection.getAdmin()) {
            TableName tableName = TableName.valueOf("product_info");
            // Drop any existing table so the example starts from a clean slate
            if (admin.tableExists(tableName)) {
                admin.disableTable(tableName);
                admin.deleteTable(tableName);
            }
            // HTableDescriptor/HColumnDescriptor are the classic admin API (deprecated in HBase 2.x)
            HTableDescriptor descriptor = new HTableDescriptor(tableName);
            descriptor.addFamily(new HColumnDescriptor("product_details"));
            descriptor.addFamily(new HColumnDescriptor("inventory"));
            admin.createTable(descriptor);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
```
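On HBase 2.x, where those descriptor classes are deprecated, the builder-based equivalent would look roughly like this (a sketch, assuming a 2.x client):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.*;

import java.io.IOException;

public class CreateTableBuilderExample {
    public static void main(String[] args) throws IOException {
        Configuration config = HBaseConfiguration.create();
        try (Connection connection = ConnectionFactory.createConnection(config);
             Admin admin = connection.getAdmin()) {
            // Immutable descriptors built via builders replace HTableDescriptor/HColumnDescriptor
            TableDescriptor descriptor =
                    TableDescriptorBuilder.newBuilder(TableName.valueOf("product_info"))
                            .setColumnFamily(ColumnFamilyDescriptorBuilder.of("product_details"))
                            .setColumnFamily(ColumnFamilyDescriptorBuilder.of("inventory"))
                            .build();
            admin.createTable(descriptor);
        }
    }
}
```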
Insert a product:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.*;
import org.apache.hadoop.hbase.util.Bytes;

import java.io.IOException;

public class HBaseExample {
    public static void main(String[] args) {
        Configuration config = HBaseConfiguration.create();
        try (Connection connection = ConnectionFactory.createConnection(config);
             Table table = connection.getTable(TableName.valueOf("product_info"))) {
            Put put = new Put(Bytes.toBytes("1001")); // product_id as the row key
            put.addColumn(Bytes.toBytes("product_details"), Bytes.toBytes("name"), Bytes.toBytes("Product 1"));
            put.addColumn(Bytes.toBytes("product_details"), Bytes.toBytes("description"), Bytes.toBytes("Description 1"));
            put.addColumn(Bytes.toBytes("product_details"), Bytes.toBytes("price"), Bytes.toBytes("10.99"));
            put.addColumn(Bytes.toBytes("inventory"), Bytes.toBytes("quantity"), Bytes.toBytes("100"));
            table.put(put);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
```
Query a product by its row key:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.*;
import org.apache.hadoop.hbase.util.Bytes;

import java.io.IOException;

public class HBaseExample {
    public static void main(String[] args) {
        Configuration config = HBaseConfiguration.create();
        try (Connection connection = ConnectionFactory.createConnection(config);
             Table table = connection.getTable(TableName.valueOf("product_info"))) {
            Get get = new Get(Bytes.toBytes("1001"));
            get.addColumn(Bytes.toBytes("product_details"), Bytes.toBytes("name"));
            get.addColumn(Bytes.toBytes("product_details"), Bytes.toBytes("description"));
            get.addColumn(Bytes.toBytes("product_details"), Bytes.toBytes("price"));
            get.addColumn(Bytes.toBytes("inventory"), Bytes.toBytes("quantity"));
            Result result = table.get(get);
            byte[] name = result.getValue(Bytes.toBytes("product_details"), Bytes.toBytes("name"));
            byte[] description = result.getValue(Bytes.toBytes("product_details"), Bytes.toBytes("description"));
            byte[] price = result.getValue(Bytes.toBytes("product_details"), Bytes.toBytes("price"));
            byte[] quantity = result.getValue(Bytes.toBytes("inventory"), Bytes.toBytes("quantity"));
            System.out.println("Name: " + Bytes.toString(name));
            System.out.println("Description: " + Bytes.toString(description));
            System.out.println("Price: " + Bytes.toString(price));
            System.out.println("Quantity: " + Bytes.toString(quantity));
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
```
Update the price and stock quantity (a `Put` to existing cells writes new versions):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.*;
import org.apache.hadoop.hbase.util.Bytes;

import java.io.IOException;

public class HBaseExample {
    public static void main(String[] args) {
        Configuration config = HBaseConfiguration.create();
        try (Connection connection = ConnectionFactory.createConnection(config);
             Table table = connection.getTable(TableName.valueOf("product_info"))) {
            Put put = new Put(Bytes.toBytes("1001"));
            put.addColumn(Bytes.toBytes("product_details"), Bytes.toBytes("price"), Bytes.toBytes("11.99"));
            put.addColumn(Bytes.toBytes("inventory"), Bytes.toBytes("quantity"), Bytes.toBytes("90"));
            table.put(put);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
```
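For stock adjustments specifically, a read-modify-write `Put` is racy under concurrent access. If the quantity were stored as an 8-byte long (written with `Bytes.toBytes(100L)`) rather than the string `"100"` used above, HBase's atomic counter API could adjust it server-side; a hedged sketch of that variant:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.*;
import org.apache.hadoop.hbase.util.Bytes;

import java.io.IOException;

public class InventoryDecrementExample {
    public static void main(String[] args) throws IOException {
        Configuration config = HBaseConfiguration.create();
        try (Connection connection = ConnectionFactory.createConnection(config);
             Table table = connection.getTable(TableName.valueOf("product_info"))) {
            // Atomically subtract 10 from inventory:quantity and return the new value.
            // Requires the cell to hold an 8-byte long, not a string.
            long remaining = table.incrementColumnValue(
                    Bytes.toBytes("1001"),
                    Bytes.toBytes("inventory"),
                    Bytes.toBytes("quantity"),
                    -10L);
            System.out.println("Remaining stock: " + remaining);
        }
    }
}
```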
Delete a product (the entire row, since no column is specified on the `Delete`):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.*;
import org.apache.hadoop.hbase.util.Bytes;

import java.io.IOException;

public class HBaseExample {
    public static void main(String[] args) {
        Configuration config = HBaseConfiguration.create();
        try (Connection connection = ConnectionFactory.createConnection(config);
             Table table = connection.getTable(TableName.valueOf("product_info"))) {
            Delete delete = new Delete(Bytes.toBytes("1001"));
            table.delete(delete);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
```
To deploy and test, run the insert, query, update, and delete programs above against a running HBase cluster and verify the effect of each step in the HBase shell.
`TableNotFoundException` is thrown when the target table does not exist. Because it is a subclass of `IOException`, it can be caught explicitly instead of string-matching a generic exception message:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.TableNotFoundException;
import org.apache.hadoop.hbase.client.*;
import org.apache.hadoop.hbase.util.Bytes;

import java.io.IOException;

public class HBaseExample {
    public static void main(String[] args) {
        Configuration config = HBaseConfiguration.create();
        try (Connection connection = ConnectionFactory.createConnection(config);
             Table table = connection.getTable(TableName.valueOf("product_info"))) {
            Put put = new Put(Bytes.toBytes("1001"));
            put.addColumn(Bytes.toBytes("product_details"), Bytes.toBytes("name"), Bytes.toBytes("Product 1"));
            table.put(put); // fails here if the table does not exist
        } catch (TableNotFoundException e) {
            System.out.println("Table not found: create 'product_info' before writing to it.");
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
```
`TableNotEnabledException` is thrown when an operation requires a table to be enabled but it has been disabled; re-enable it with `Admin.enableTable(...)` or the shell's `enable` command:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.TableNotEnabledException;
import org.apache.hadoop.hbase.client.*;
import org.apache.hadoop.hbase.util.Bytes;

import java.io.IOException;

public class HBaseExample {
    public static void main(String[] args) {
        Configuration config = HBaseConfiguration.create();
        try (Connection connection = ConnectionFactory.createConnection(config);
             Table table = connection.getTable(TableName.valueOf("product_info"))) {
            Put put = new Put(Bytes.toBytes("1001"));
            put.addColumn(Bytes.toBytes("product_details"), Bytes.toBytes("name"), Bytes.toBytes("Product 1"));
            table.put(put); // a write against a disabled table surfaces as an exception here
        } catch (TableNotEnabledException e) {
            System.out.println("Table is disabled: run \"enable 'product_info'\" in the HBase shell.");
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
```
You can check a table's schema and whether it is enabled with the shell's `describe` command:

```
hbase(main):001:0> describe 'product_info'
```
For backup and recovery, newer HBase releases provide the `hbase backup` command (full and incremental backups). For performance, tune parameters such as `hbase.client.write.buffer` and `hbase.regionserver.global.memstore.size`:

```xml
<property>
  <name>hbase.client.write.buffer</name>
  <value>10485760</value> <!-- 10 MB client-side write buffer -->
</property>
<property>
  <name>hbase.regionserver.global.memstore.size</name>
  <value>0.4</value> <!-- memstores may use up to 40% of the region server heap -->
</property>
```
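On the client side, the write buffer is exercised through `BufferedMutator`, which batches small `Put`s into fewer RPCs. A minimal sketch against the `product_info` table (the explicit buffer size mirrors the 10 MB setting above):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.*;
import org.apache.hadoop.hbase.util.Bytes;

import java.io.IOException;

public class BufferedWriteExample {
    public static void main(String[] args) throws IOException {
        Configuration config = HBaseConfiguration.create();
        BufferedMutatorParams params =
                new BufferedMutatorParams(TableName.valueOf("product_info"))
                        .writeBufferSize(10 * 1024 * 1024); // 10 MB, matching hbase.client.write.buffer
        try (Connection connection = ConnectionFactory.createConnection(config);
             BufferedMutator mutator = connection.getBufferedMutator(params)) {
            for (int i = 0; i < 10_000; i++) {
                Put put = new Put(Bytes.toBytes("100" + i));
                put.addColumn(Bytes.toBytes("inventory"), Bytes.toBytes("quantity"), Bytes.toBytes("100"));
                mutator.mutate(put); // buffered; flushed automatically when the buffer fills
            }
            mutator.flush(); // push any remainder before closing
        }
    }
}
```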
For monitoring, HBase exposes cluster, table, and region-server metrics through the Master web UI (port 16010 by default) and over JMX, which external monitoring systems can scrape.

This concludes this beginner-to-intermediate HBase tutorial, covering core concepts, environment setup and configuration, data operations and management, a hands-on project, and common problems with their solutions. We hope it helps you understand and use HBase in your own projects.