Step 0: Preparation
select from_unixtime(1508673584) as fut;
Result:
| fut                 |
| ------------------- |
| 2017-10-22 11:59:44 |
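This output can be reproduced outside Spark. A minimal plain-Python sketch, assuming the Spark session's time zone is UTC (which is consistent with the result shown above):

```python
from datetime import datetime, timezone

# from_unixtime interprets its argument as seconds since the Unix epoch
# and formats it as a string. Assuming a UTC session time zone, the
# equivalent conversion in plain Python is:
fut = datetime.fromtimestamp(1508673584, tz=timezone.utc)
print(fut.strftime("%Y-%m-%d %H:%M:%S"))  # 2017-10-22 11:59:44
```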
Step 1: Create a table with the result of from_unixtime()
create table test select from_unixtime(1508673584) as fut;
Step 2: Examine the data type of the column fut
describe test;
Result:
| col_name | data_type | comment |
| -------- | --------- | ------- |
| fut      | string    | &lt;null&gt; |
I also tried this
select from_utc_timestamp(1508618794*1000, 'EDT');
However, I got the following error:
Error: org.apache.spark.sql.AnalysisException: cannot resolve 'from_utc_timestamp((1508618794 * 1000), 'EDT')' due to data type mismatch: argument 1 requires timestamp type, however, '(1508618794 * 1000)' is of int type.; line 2 pos 2; 'Project [unresolvedalias(from_utc_timestamp((1508618794 * 1000), EDT), None)] +- OneRowRelation$ SQLState: null ErrorCode: 0
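The error says argument 1 must already be a timestamp; an int, even scaled by 1000, is not implicitly converted. For reference, the intended conversion of those epoch seconds to an EDT wall-clock time can be sketched in plain Python (assuming EDT as a fixed UTC-4 offset, which ignores daylight-saving transitions):

```python
from datetime import datetime, timezone, timedelta

# Epoch seconds -> timezone-aware datetime in EDT (assumed here as a
# fixed UTC-4 offset; a real tz database would handle DST correctly).
edt = timezone(timedelta(hours=-4), name="EDT")
local = datetime.fromtimestamp(1508618794, tz=edt)
print(local.strftime("%Y-%m-%d %H:%M:%S"))  # 2017-10-21 16:46:34
```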
Solution
(I am providing an answer myself here.)
The answer is to use cast(), which works for both date and timestamp:
select from_unixtime(1508673584) as fut, cast(from_unixtime(1508673584) as date) as futAsDate, cast(from_unixtime(1508673584) as timestamp) as futAsTimestamp;
Result:
| fut                 | futAsDate  | futAsTimestamp        |
| ------------------- | ---------- | --------------------- |
| 2017-10-22 11:59:44 | 2017-10-22 | 2017-10-22 11:59:44.0 |
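What the two cast() calls do can be mirrored in plain Python: the formatted string is parsed back into a typed value, and the date form simply drops the time-of-day part. A sketch (plain Python, not Spark API; variable names chosen to match the columns above):

```python
from datetime import datetime

# fut is a plain string, like the untyped from_unixtime output.
fut = "2017-10-22 11:59:44"

# cast(... as timestamp): parse the string into a full timestamp.
futAsTimestamp = datetime.strptime(fut, "%Y-%m-%d %H:%M:%S")

# cast(... as date): keep only the date portion, dropping the time.
futAsDate = futAsTimestamp.date()
print(futAsDate)  # 2017-10-22
```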
Verification of data types
create table test2 select from_unixtime(1508673584) as fut, cast(from_unixtime(1508673584) as date) as futAsDate, cast(from_unixtime(1508673584) as timestamp) as futAsTimestamp;
And then
describe test2;
Result:
| col_name       | data_type | comment |
| -------------- | --------- | ------- |
| fut            | string    | &lt;null&gt; |
| futAsDate      | date      | &lt;null&gt; |
| futAsTimestamp | timestamp | &lt;null&gt; |