Java HiveChar Class Code Examples


This article collects typical usage examples of the Java class org.apache.hadoop.hive.common.type.HiveChar. If you are wondering what the HiveChar class is for, how to use it, or what real-world usage looks like, the curated code examples below should help.



The HiveChar class belongs to the org.apache.hadoop.hive.common.type package. A total of 20 HiveChar code examples are shown below, sorted by popularity by default.
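Before the individual examples, here is a minimal, self-contained sketch (not taken from any of the projects below; the class name HiveCharBasics is made up for illustration) of the basic HiveChar behaviour these snippets rely on: the constructor takes a string plus a declared CHAR length, values longer than that length are truncated, and shorter values follow SQL CHAR blank-padding semantics. The getPaddedValue()/getStrippedValue() calls and the exact outputs shown are assumptions based on the standard Hive API.

import org.apache.hadoop.hive.common.type.HiveChar;

public class HiveCharBasics {
  public static void main(String[] args) {
    // char(1): the input is truncated to the declared length.
    HiveChar c1 = new HiveChar("hello", 1);
    System.out.println(c1.getValue());                      // "h"

    // char(5): a shorter value keeps SQL CHAR padding semantics.
    HiveChar c2 = new HiveChar("ab", 5);
    System.out.println("[" + c2.getPaddedValue() + "]");    // expected "[ab   ]" (blank-padded to length 5)
    System.out.println("[" + c2.getStrippedValue() + "]");  // expected "[ab]" (trailing blanks removed)
  }
}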

Example 1: getTransformedWritable

import org.apache.hadoop.hive.common.type.HiveChar; // import the required package/class
@Override
public Object getTransformedWritable(DeferredObject object) throws HiveException {
  HiveChar value = columnType.getPrimitiveJavaObject(object.get());

  if(value != null) {
    String transformedValue = transformer.transform(value.getValue());

    if(transformedValue != null) {
      writable.set(transformedValue);

      return writable;
    }
  }

  return null;
}
 
Developer: myui, Project: hive-udf-backports, Lines: 17, Source file: BaseMaskUDF.java


Example 2: convertClobType

import org.apache.hadoop.hive.common.type.HiveChar; // import the required package/class
private Object convertClobType(Object val, HCatFieldSchema hfs) {
  HCatFieldSchema.Type hfsType = hfs.getType();
  ClobRef cr = (ClobRef) val;
  String s = cr.isExternal() ? cr.toString() : cr.getData();

  if (hfsType == HCatFieldSchema.Type.STRING) {
    return s;
  } else if (hfsType == HCatFieldSchema.Type.VARCHAR) {
    VarcharTypeInfo vti = (VarcharTypeInfo) hfs.getTypeInfo();
    HiveVarchar hvc = new HiveVarchar(s, vti.getLength());
    return hvc;
  } else if (hfsType == HCatFieldSchema.Type.CHAR) {
    CharTypeInfo cti = (CharTypeInfo) hfs.getTypeInfo();
    HiveChar hc = new HiveChar(s, cti.getLength());
    return hc;
  }
  return null;
}
 
Developer: aliyun, Project: aliyun-maxcompute-data-collectors, Lines: 19, Source file: SqoopHCatImportHelper.java


Example 3: testStringTypes

import org.apache.hadoop.hive.common.type.HiveChar; // import the required package/class
public void testStringTypes() throws Exception {
  final int TOTAL_RECORDS = 1 * 10;
  String table = getTableName().toUpperCase();
  ColumnGenerator[] cols = new ColumnGenerator[] {
    HCatalogTestUtils.colGenerator(HCatalogTestUtils.forIdx(0),
      "char(14)", Types.CHAR, HCatFieldSchema.Type.STRING, 0, 0,
      "string to test", "string to test", KeyType.NOT_A_KEY),
    HCatalogTestUtils.colGenerator(HCatalogTestUtils.forIdx(1),
        "char(14)", Types.CHAR, HCatFieldSchema.Type.CHAR, 14, 0,
        new HiveChar("string to test", 14), "string to test",
        KeyType.NOT_A_KEY),
    HCatalogTestUtils.colGenerator(HCatalogTestUtils.forIdx(2),
        "char(14)", Types.CHAR, HCatFieldSchema.Type.VARCHAR, 14, 0,
        new HiveVarchar("string to test", 14), "string to test",
        KeyType.NOT_A_KEY),
    HCatalogTestUtils.colGenerator(HCatalogTestUtils.forIdx(3),
      "longvarchar", Types.LONGVARCHAR, HCatFieldSchema.Type.STRING, 0, 0,
      "string to test", "string to test", KeyType.NOT_A_KEY),
  };
  List<String> addlArgsArray = new ArrayList<String>();
  setExtraArgs(addlArgsArray);
  runHCatImport(addlArgsArray, TOTAL_RECORDS, table, cols, null);
}
 
Developer: aliyun, Project: aliyun-maxcompute-data-collectors, Lines: 24, Source file: HCatalogImportTest.java


Example 4: testStringTypes

import org.apache.hadoop.hive.common.type.HiveChar; // import the required package/class
public void testStringTypes() throws Exception {
  final int TOTAL_RECORDS = 1 * 10;
  String table = getTableName().toUpperCase();
  ColumnGenerator[] cols = new ColumnGenerator[] {
    HCatalogTestUtils.colGenerator(HCatalogTestUtils.forIdx(0),
      "char(14)", Types.CHAR, HCatFieldSchema.Type.STRING, 0, 0,
      "string to test", "string to test", KeyType.NOT_A_KEY),
    HCatalogTestUtils.colGenerator(HCatalogTestUtils.forIdx(1),
        "char(14)", Types.CHAR, HCatFieldSchema.Type.CHAR, 14, 0,
        new HiveChar("string to test", 14), "string to test",
        KeyType.NOT_A_KEY),
    HCatalogTestUtils.colGenerator(HCatalogTestUtils.forIdx(2),
        "char(14)", Types.CHAR, HCatFieldSchema.Type.VARCHAR, 14, 0,
        new HiveVarchar("string to test", 14), "string to test",
        KeyType.NOT_A_KEY),
    HCatalogTestUtils.colGenerator(HCatalogTestUtils.forIdx(3),
      "longvarchar", Types.LONGVARCHAR, HCatFieldSchema.Type.STRING, 0, 0,
      "string to test", "string to test", KeyType.NOT_A_KEY),
  };
  List<String> addlArgsArray = new ArrayList<String>();
  runHCatExport(addlArgsArray, TOTAL_RECORDS, table, cols);
}
 
Developer: aliyun, Project: aliyun-maxcompute-data-collectors, Lines: 23, Source file: HCatalogExportTest.java


Example 5: getJavaObjectFromPrimitiveData

import org.apache.hadoop.hive.common.type.HiveChar; // import the required package/class
private static Object getJavaObjectFromPrimitiveData(Object data, ObjectInspector objInsp) {
    assert(objInsp.getCategory() == Category.PRIMITIVE);
    if (data == null) {
        return null;
    }
    if (data instanceof BytesWritable && objInsp instanceof WritableHiveDecimalObjectInspector) {
        // BytesWritable cannot be directly cast to HiveDecimalWritable
        WritableHiveDecimalObjectInspector oi = (WritableHiveDecimalObjectInspector) objInsp;
        data = oi.create(((BytesWritable) data).getBytes(), oi.scale());
    }
    Object obj = ObjectInspectorUtils.copyToStandardJavaObject(data, objInsp);
    if (obj instanceof HiveDecimal) {
        obj = ((HiveDecimal) obj).bigDecimalValue();
    } else if (obj instanceof HiveVarchar || obj instanceof HiveChar) {
        obj = obj.toString();
    } else if (obj instanceof byte[]) {
        obj = Hex.encodeHexString((byte[]) obj);
    }
    return obj;
}
 
Developer: EXASOL, Project: hadoop-etl-udfs, Lines: 21, Source file: HdfsSerDeImportService.java


Example 6: getJavaObjectFromFieldData

import org.apache.hadoop.hive.common.type.HiveChar; // import the required package/class
private static Object getJavaObjectFromFieldData(Object data, ObjectInspector objInsp) {
    if (data == null) {
        return null;
    }
    if (objInsp.getCategory() == Category.PRIMITIVE) {
        Object obj = ObjectInspectorUtils.copyToStandardJavaObject(data, objInsp);
        if (obj instanceof HiveDecimal) {
            obj = ((HiveDecimal) obj).bigDecimalValue();
        } else if (obj instanceof HiveVarchar || obj instanceof HiveChar) {
            obj = obj.toString();
        } else if (obj instanceof byte[]) {
            obj = Hex.encodeHexString((byte[]) obj);
        }
        return obj;
    } else if (objInsp.getCategory() == Category.LIST) {
        return getJsonArrayFromFieldData(data, objInsp, Json.createBuilderFactory(null)).build().toString();
    } else {
        return getJsonObjectFromFieldData(data, objInsp, Json.createBuilderFactory(null)).build().toString();
    }
}
 
Developer: EXASOL, Project: hadoop-etl-udfs, Lines: 21, Source file: HdfsSerDeImportService.java


Example 7: writeChar

import org.apache.hadoop.hive.common.type.HiveChar; // import the required package/class
@Test
public void writeChar() throws IOException {
  List<Object> values = new ArrayList<>();
  values.add("hello");
  values.add(new HiveChar("world", 1));
  values.add(null);

  write(TypeInfoFactory.getCharTypeInfo(1), values);

  try (OrcReader reader = getOrcReader()) {
    assertThat(reader.hasNext(), is(true));
    assertThat(((HiveChar) reader.next().get(0)).getValue(), is("h"));

    assertThat(reader.hasNext(), is(true));
    assertThat(((HiveChar) reader.next().get(0)).getValue(), is("w"));

    assertThat(reader.hasNext(), is(true));
    assertThat(reader.next().get(0), is(nullValue()));

    assertThat(reader.hasNext(), is(false));
  }
}
 
Developer: HotelsDotCom, Project: corc, Lines: 23, Source file: OrcFileTest.java


Example 8: writeMapCharString

import org.apache.hadoop.hive.common.type.HiveChar; // import the required package/class
@Test
public void writeMapCharString() throws IOException {
  Map<Object, Object> map = new HashMap<>();
  map.put("hello", "world");
  map.put("hi", "world");

  List<Object> values = new ArrayList<>();
  values.add(map);
  values.add(null);

  write(TypeInfoFactory.getMapTypeInfo(TypeInfoFactory.getCharTypeInfo(1), TypeInfoFactory.stringTypeInfo), values);

  Map<Object, Object> expected = new HashMap<>();
  expected.put(new HiveChar("h", 1), "world");

  try (OrcReader reader = getOrcReader()) {
    assertThat(reader.hasNext(), is(true));
    assertThat(reader.next().get(0), is((Object) expected));

    assertThat(reader.hasNext(), is(true));
    assertThat(reader.next().get(0), is(nullValue()));

    assertThat(reader.hasNext(), is(false));
  }
}
 
Developer: HotelsDotCom, Project: corc, Lines: 26, Source file: OrcFileTest.java


Example 9: readMapCharString

import org.apache.hadoop.hive.common.type.HiveChar; // import the required package/class
@Test
public void readMapCharString() throws IOException {
  TypeInfo typeInfo = TypeInfoFactory.getMapTypeInfo(TypeInfoFactory.getCharTypeInfo(1),
      TypeInfoFactory.stringTypeInfo);

  Map<Object, Object> map = new HashMap<>();
  map.put(new HiveChar("h", 1), "world");

  try (OrcWriter writer = getOrcWriter(typeInfo)) {
    writer.addRow(map);
    writer.addRow((Object) null);
  }

  Map<Object, Object> expected = new HashMap<>();
  expected.put("h", "world");

  List<Tuple> list = read(typeInfo);
  assertThat(list.size(), is(2));
  assertThat(list.get(0).getObject(0), is((Object) expected));
  assertThat(list.get(1).getObject(0), is(nullValue()));
}
 
Developer: HotelsDotCom, Project: corc, Lines: 22, Source file: OrcFileTest.java


Example 10: readCharPredicatePushdown

import org.apache.hadoop.hive.common.type.HiveChar; // import the required package/class
@Test
public void readCharPredicatePushdown() throws IOException {
  TypeInfo typeInfo = TypeInfoFactory.getCharTypeInfo(3);

  try (OrcWriter writer = getOrcWriter(typeInfo)) {
    writer.addRow(new HiveChar("foo", 3));
    writer.addRow(new HiveChar("bar", 3));
  }

  StructTypeInfo structTypeInfo = new StructTypeInfoBuilder().add("a", typeInfo).build();

  SearchArgument searchArgument = SearchArgumentFactory
      .newBuilder()
      .startAnd()
      .equals("a", new HiveChar("foo", 5))
      .end()
      .build();

  OrcFile orcFile = OrcFile.source().columns(structTypeInfo).schemaFromFile().searchArgument(searchArgument).build();
  Tap<?, ?, ?> tap = new Hfs(orcFile, path);

  List<Tuple> list = Plunger.readDataFromTap(tap).asTupleList();

  assertThat(list.size(), is(1));
  assertThat(list.get(0).getObject(0), is((Object) "foo"));
}
 
Developer: HotelsDotCom, Project: corc, Lines: 27, Source file: OrcFileTest.java


Example 11: convertStringTypes

import org.apache.hadoop.hive.common.type.HiveChar; // import the required package/class
private Object convertStringTypes(Object val, HCatFieldSchema hfs) {
  HCatFieldSchema.Type hfsType = hfs.getType();
  if (hfsType == HCatFieldSchema.Type.STRING
      || hfsType == HCatFieldSchema.Type.VARCHAR
      || hfsType == HCatFieldSchema.Type.CHAR) {
    String str = val.toString();
    if (doHiveDelimsReplacement) {
      str = FieldFormatter.hiveStringReplaceDelims(str,
        hiveDelimsReplacement, hiveDelimiters);
    }
    if (hfsType == HCatFieldSchema.Type.STRING) {
      return str;
    } else if (hfsType == HCatFieldSchema.Type.VARCHAR) {
      VarcharTypeInfo vti = (VarcharTypeInfo) hfs.getTypeInfo();
      HiveVarchar hvc = new HiveVarchar(str, vti.getLength());
      return hvc;
    } else if (hfsType == HCatFieldSchema.Type.CHAR) {
      CharTypeInfo cti = (CharTypeInfo) hfs.getTypeInfo();
      HiveChar hc = new HiveChar(val.toString(), cti.getLength());
      return hc;
    }
  } else if (hfsType == HCatFieldSchema.Type.DECIMAL) {
    BigDecimal bd = new BigDecimal(val.toString(), MathContext.DECIMAL128);
    HiveDecimal hd = HiveDecimal.create(bd);
    return hd;
  }
  return null;
}
 
Developer: aliyun, Project: aliyun-maxcompute-data-collectors, Lines: 29, Source file: SqoopHCatImportHelper.java


Example 12: convertBooleanTypes

import org.apache.hadoop.hive.common.type.HiveChar; // import the required package/class
private Object convertBooleanTypes(Object val, HCatFieldSchema hfs) {
  HCatFieldSchema.Type hfsType = hfs.getType();
  Boolean b = (Boolean) val;
  if (hfsType == HCatFieldSchema.Type.BOOLEAN) {
    return b;
  } else if (hfsType == HCatFieldSchema.Type.TINYINT) {
    return (byte) (b ? 1 : 0);
  } else if (hfsType == HCatFieldSchema.Type.SMALLINT) {
    return (short) (b ? 1 : 0);
  } else if (hfsType == HCatFieldSchema.Type.INT) {
    return (int) (b ? 1 : 0);
  } else if (hfsType == HCatFieldSchema.Type.BIGINT) {
    return (long) (b ? 1 : 0);
  } else if (hfsType == HCatFieldSchema.Type.FLOAT) {
    return (float) (b ? 1 : 0);
  } else if (hfsType == HCatFieldSchema.Type.DOUBLE) {
    return (double) (b ? 1 : 0);
  } else if (hfsType == HCatFieldSchema.Type.STRING) {
    return val.toString();
  } else if (hfsType == HCatFieldSchema.Type.VARCHAR) {
    VarcharTypeInfo vti = (VarcharTypeInfo) hfs.getTypeInfo();
    HiveVarchar hvc = new HiveVarchar(val.toString(), vti.getLength());
    return hvc;
  } else if (hfsType == HCatFieldSchema.Type.CHAR) {
    CharTypeInfo cti = (CharTypeInfo) hfs.getTypeInfo();
    HiveChar hChar = new HiveChar(val.toString(), cti.getLength());
    return hChar;
  }
  return null;
}
 
Developer: aliyun, Project: aliyun-maxcompute-data-collectors, Lines: 31, Source file: SqoopHCatImportHelper.java


Example 13: extractValue

import org.apache.hadoop.hive.common.type.HiveChar; // import the required package/class
@Override
public String extractValue(final Object data, final ObjectInspector objectInspector)
    throws HiveException {
  final Object value = inputObjectInspector.getPrimitiveJavaObject(data);
  if (value instanceof String) {
    return (String) value;
  } else if (value instanceof HiveChar) {
    return ((HiveChar) value).getValue();
  } else if (value instanceof HiveVarchar) {
    return ((HiveVarchar) value).getValue();
  } else {
    throw new UDFArgumentTypeException(0, "unsupported type " + value.getClass().getName());
  }
}
 
Developer: DataSketches, Project: sketches-hive, Lines: 15, Source file: DataToStringsSketchUDAF.java


Example 14: toComparable

import org.apache.hadoop.hive.common.type.HiveChar; // import the required package/class
static Comparable<?> toComparable(PrimitiveCategory category, Object literal) {
  String stringLiteral;
  switch (category) {
  case STRING:
    return new Text((String) literal);
  case BOOLEAN:
    return new BooleanWritable((Boolean) literal);
  case BYTE:
    return new ByteWritable(((Long) literal).byteValue());
  case SHORT:
    return new ShortWritable(((Long) literal).shortValue());
  case INT:
    return new IntWritable(((Long) literal).intValue());
  case LONG:
    return new LongWritable((Long) literal);
  case FLOAT:
    return new FloatWritable(((Double) literal).floatValue());
  case DOUBLE:
    return new DoubleWritable((Double) literal);
  case TIMESTAMP:
    return new TimestampWritable((Timestamp) literal);
  case DATE:
    return (DateWritable) literal;
  case CHAR:
    stringLiteral = (String) literal;
    return new HiveCharWritable(new HiveChar(stringLiteral, stringLiteral.length()));
  case VARCHAR:
    stringLiteral = (String) literal;
    return new HiveVarcharWritable(new HiveVarchar(stringLiteral, stringLiteral.length()));
  case DECIMAL:
    return new HiveDecimalWritable(HiveDecimal.create((BigDecimal) literal));
  default:
    throw new IllegalArgumentException("Unsupported category: " + category);
  }
}
 
Developer: HotelsDotCom, Project: corc, Lines: 36, Source file: EvaluatorFactory.java


Example 15: toJava

import org.apache.hadoop.hive.common.type.HiveChar; // import the required package/class
@Test
public void toJava() throws UnexpectedTypeException {
  StructTypeInfo nested = new StructTypeInfoBuilder().add("char1", TypeInfoFactory.getCharTypeInfo(1)).build();
  TypeInfo typeInfo = new StructTypeInfoBuilder()
      .add("char1", TypeInfoFactory.getCharTypeInfo(1))
      .add("struct_char1", nested)
      .build();

  SettableStructObjectInspector inspector = (SettableStructObjectInspector) OrcStruct.createObjectInspector(typeInfo);
  Object struct = inspector.create();
  inspector.setStructFieldData(struct, inspector.getStructFieldRef("char1"),
      new HiveCharWritable(new HiveChar("a", -1)));

  SettableStructObjectInspector nestedInspector = (SettableStructObjectInspector) OrcStruct
      .createObjectInspector(nested);
  Object nestedStruct = inspector.create();
  nestedInspector.setStructFieldData(nestedStruct, nestedInspector.getStructFieldRef("char1"),
      new HiveCharWritable(new HiveChar("b", -1)));
  inspector.setStructFieldData(struct, inspector.getStructFieldRef("struct_char1"), nestedStruct);

  List<Object> list = new ArrayList<>();
  list.add(new HiveChar("a", -1));
  list.add(Arrays.asList(new HiveChar("b", -1)));

  Converter converter = factory.newConverter(inspector);

  Object convertedList = converter.toJavaObject(struct);
  assertThat(convertedList, is((Object) list));

  Object convertedStruct = converter.toWritableObject(list);
  assertThat(convertedStruct, is(struct));
}
 
Developer: HotelsDotCom, Project: corc, Lines: 33, Source file: DefaultConverterFactoryTest.java


Example 16: readChar

import org.apache.hadoop.hive.common.type.HiveChar; // import the required package/class
@Test
public void readChar() throws IOException {
  TypeInfo typeInfo = TypeInfoFactory.getCharTypeInfo(1);

  try (OrcWriter writer = getOrcWriter(typeInfo)) {
    writer.addRow(new HiveChar("hello", 1));
    writer.addRow((Object) null);
  }

  List<Tuple> list = read(typeInfo);
  assertThat(list.size(), is(2));
  assertThat(list.get(0).getObject(0), is((Object) "h"));
  assertThat(list.get(1).getObject(0), is(nullValue()));
}
 
Developer: HotelsDotCom, Project: corc, Lines: 15, Source file: OrcFileTest.java


Example 17: writeStructChar

import org.apache.hadoop.hive.common.type.HiveChar; // import the required package/class
@Test
public void writeStructChar() throws IOException {
  List<Object> struct1 = new ArrayList<>();
  struct1.add("hello");

  List<Object> struct2 = new ArrayList<>();
  struct2.add(new HiveChar("world", 1));

  List<Object> struct3 = new ArrayList<>();
  struct3.add(null);

  List<Object> values = new ArrayList<>();
  values.add(struct1);
  values.add(struct2);
  values.add(struct3);
  values.add(null);

  write(new StructTypeInfoBuilder().add("b", TypeInfoFactory.getCharTypeInfo(1)).build(), values);

  try (OrcReader reader = getOrcReader()) {
    assertThat(reader.hasNext(), is(true));
    assertThat(((HiveChar) ((List) reader.next().get(0)).get(0)).getValue(), is("h"));

    assertThat(reader.hasNext(), is(true));
    assertThat(((HiveChar) ((List) reader.next().get(0)).get(0)).getValue(), is("w"));

    assertThat(reader.hasNext(), is(true));
    assertThat(((List) reader.next().get(0)).get(0), is(nullValue()));

    assertThat(reader.hasNext(), is(true));
    assertThat(reader.next().get(0), is(nullValue()));

    assertThat(reader.hasNext(), is(false));
  }
}
 
Developer: HotelsDotCom, Project: corc, Lines: 36, Source file: OrcFileTest.java


Example 18: readStructChar

import org.apache.hadoop.hive.common.type.HiveChar; // import the required package/class
@Test
public void readStructChar() throws IOException {
  TypeInfo typeInfo = new StructTypeInfoBuilder().add("b", TypeInfoFactory.getCharTypeInfo(1)).build();

  List<Object> struct1 = new ArrayList<>();
  struct1.add(new HiveChar("hello", 1));

  List<Object> struct2 = new ArrayList<>();
  struct2.add(new HiveChar("world", 1));

  List<Object> struct3 = new ArrayList<>();
  struct3.add(null);

  try (OrcWriter writer = getOrcWriter(typeInfo)) {
    writer.addRow((Object) struct1);
    writer.addRow((Object) struct2);
    writer.addRow((Object) struct3);
    writer.addRow((Object) null);
  }

  List<Tuple> list = read(typeInfo);
  assertThat(list.size(), is(4));
  assertThat(list.get(0).getObject(0), is((Object) Arrays.asList("h")));
  assertThat(list.get(1).getObject(0), is((Object) Arrays.asList("w")));
  assertThat(list.get(2).getObject(0), is((Object) Arrays.asList((Object) null)));
  assertThat(list.get(3).getObject(0), is(nullValue()));
}
 
Developer: HotelsDotCom, Project: corc, Lines: 28, Source file: OrcFileTest.java


Example 19: writeViaTypeInfo

import org.apache.hadoop.hive.common.type.HiveChar; // import the required package/class
@Test
public void writeViaTypeInfo() throws IOException {
  StructTypeInfo typeInfo = new StructTypeInfoBuilder()
      .add("a", TypeInfoFactory.stringTypeInfo)
      .add("b", TypeInfoFactory.getListTypeInfo(TypeInfoFactory.stringTypeInfo))
      .add("c", TypeInfoFactory.getMapTypeInfo(TypeInfoFactory.stringTypeInfo, TypeInfoFactory.stringTypeInfo))
      .add("d", TypeInfoFactory.getCharTypeInfo(1))
      .add("e", TypeInfoFactory.getVarcharTypeInfo(1))
      .add("f", TypeInfoFactory.getDecimalTypeInfo(2, 1))
      .build();

  Fields fields = SchemaFactory.newFields(typeInfo);

  Map<String, String> map = new HashMap<>();
  map.put("C1", "C1");
  Data data = new DataBuilder(fields)
      .addTuple("A1", Arrays.asList("B1"), map, "x", "y", new BigDecimal("1.234"))
      .build();
  Tap<?, ?, ?> tap = new Hfs(OrcFile.source().columns(typeInfo).schemaFromFile().build(), path);

  Plunger.writeData(data).toTap(tap);

  try (OrcReader reader = getOrcReader()) {
    assertThat(reader.hasNext(), is(true));
    List<Object> list = reader.next();
    assertThat(list.size(), is(6));
    assertThat(list.get(0), is((Object) "A1"));
    assertThat(list.get(1), is((Object) Arrays.asList("B1")));
    assertThat(list.get(2), is((Object) map));
    assertThat(((HiveChar) list.get(3)).getValue(), is((Object) "x"));
    assertThat(((HiveVarchar) list.get(4)).getValue(), is((Object) "y"));
    assertThat(((HiveDecimal) list.get(5)).bigDecimalValue(), is((Object) new BigDecimal("1.2")));

    assertThat(reader.hasNext(), is(false));
  }
}
 
Developer: HotelsDotCom, Project: corc, Lines: 37, Source file: OrcFileTest.java


Example 20: convertDatum2Writable

import org.apache.hadoop.hive.common.type.HiveChar; // import the required package/class
public static Writable convertDatum2Writable(Datum value) {
  switch(value.kind()) {
    case INT1: return new ByteWritable(value.asByte());
    case INT2: return new ShortWritable(value.asInt2());
    case INT4: return new IntWritable(value.asInt4());
    case INT8: return new LongWritable(value.asInt8());

    case FLOAT4: return new FloatWritable(value.asFloat4());
    case FLOAT8: return new DoubleWritable(value.asFloat8());

    // NOTE: value should be DateDatum
    case DATE: return new DateWritable(value.asInt4() - DateTimeConstants.UNIX_EPOCH_JDATE);

    // NOTE: value should be TimestampDatum
    case TIMESTAMP:
      TimestampWritable result = new TimestampWritable();
      result.setTime(DateTimeUtil.julianTimeToJavaTime(value.asInt8()));
      return result;

    case CHAR: {
      String str = value.asChars();
      return new HiveCharWritable(new HiveChar(str, str.length()));
    }
    case TEXT: return new Text(value.asChars());
    case VARBINARY: return new BytesWritable(value.asByteArray());

    case NULL_TYPE: return null;
  }

  throw new TajoRuntimeException(new NotImplementedException(TypeStringEncoder.encode(value.type())));
}
 
Developer: apache, Project: tajo, Lines: 32, Source file: WritableTypeConverter.java



Note: The org.apache.hadoop.hive.common.type.HiveChar class examples in this article were compiled from source-code and documentation platforms such as GitHub and MSDocs. The snippets were selected from open-source projects contributed by their respective developers; copyright remains with the original authors, and any distribution or use should follow the license of the corresponding project. Please do not reproduce this article without permission.

