Search criteria: author:"Hyukjin Kwon". Results 1 to 10 of 135 (0.0s).
[SPARK-27740] JIRA status test - Spark - [issue]    Author: Hyukjin Kwon, 2019-05-16, 01:36
[SPARK-16636] Missing documentation for CalendarIntervalType - Spark - [issue]
...I just noticed that there is a CalendarIntervalType, but it is missing from the documentation. The type was initially added in SPARK-8753 but renamed to CalendarIntervalType in ...    Author: Hyukjin Kwon, 2019-05-21, 04:15
[SPARK-20938] explain() for datasources implementing CatalystScan does not show pushed predicates correctly - Spark - [issue]
...Pushed-down Catalyst predicate expressions are not represented correctly (only source filters are shown) when we use a datasource implementing CatalystScan. For example, the below c...    Author: Hyukjin Kwon, 2019-05-21, 04:16
[SPARK-18610] greatest/least fails to run with string against date/timestamp - Spark - [issue]
...It seems Spark SQL fails to implicitly cast (or detect the wider type) between string and date/timestamp. spark-sql> select greatest("2015-02-02", date("2015-01-01")); Error in query: cannot res...    Author: Hyukjin Kwon, 2019-05-21, 04:16
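The type-resolution failure reported in SPARK-18610 can be illustrated with a minimal Python sketch. This is a hypothetical toy stand-in, not Spark's analyzer: like SQL's greatest(), it refuses to compare arguments unless they all share one type, since no implicit cast between string and date is defined here.

```python
from datetime import date

def greatest(*args):
    """Toy greatest(): requires all arguments to resolve to a single type."""
    types = {type(a) for a in args}
    if len(types) > 1:
        # Mirrors the "cannot resolve" analysis error from the report:
        # fail at resolution time rather than attempting a comparison.
        raise TypeError(
            "cannot resolve 'greatest': mixed argument types "
            + str(sorted(t.__name__ for t in types))
        )
    return max(args)

print(greatest("2015-02-02", "2015-01-01"))   # -> 2015-02-02 (all strings, fine)
try:
    greatest("2015-02-02", date(2015, 1, 1))  # mixed string/date
except TypeError as e:
    print("Error in query:", e)
```

Spark's actual fix involves its type-coercion rules; this sketch only shows why mixed string/date arguments fail when no widening rule applies.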
[SPARK-20840] Misleading spurious errors when there are Javadoc (Unidoc) breaks - Spark - [issue]
...Currently, when there are Javadoc breaks, warnings seem to be printed as errors. For example, the actual errors were as below in    Author: Hyukjin Kwon, 2019-05-21, 04:17
[SPARK-17763] JacksonParser silently parses null as 0 when the field is not nullable - Spark - [issue]
...When a field is not nullable, JacksonParser silently parses null JSON data as 0. For example, val testJson = """{"nullInt":null}""" :: Nil; val testSchema = StructType(...    Author: Hyukjin Kwon, 2019-05-21, 04:15
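The hazard SPARK-17763 describes can be contrasted with a small Python sketch. This is a hypothetical toy parser (not JacksonParser or Spark code): its schema maps each field to a (type, nullable) pair, and it raises on a null value for a non-nullable field instead of silently substituting 0.

```python
import json

def parse_row(text, schema):
    """Toy row parser: schema maps field name -> (python_type, nullable)."""
    row = json.loads(text)
    out = {}
    for name, (ftype, nullable) in schema.items():
        value = row.get(name)
        if value is None:
            if not nullable:
                # Surface the problem explicitly, rather than silently
                # emitting 0 as the issue reports JacksonParser doing.
                raise ValueError("null value for non-nullable field '%s'" % name)
            out[name] = None
        else:
            out[name] = ftype(value)
    return out

schema = {"nullInt": (int, False)}
try:
    parse_row('{"nullInt": null}', schema)
except ValueError as e:
    print(e)  # null value for non-nullable field 'nullInt'
```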
[SPARK-18246] Throws an exception before execution for unsupported types in JSON, CSV and text functionalities - Spark - [issue]
...Case 1 - read.json(rdd): val rdd = spark.sparkContext.parallelize(1 to 100).map(i => s"""{"a": "str$i"}"""); val schema = new StructType().add("a", CalendarIntervalType)    Author: Hyukjin Kwon, 2019-05-21, 04:36
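The behaviour SPARK-18246 asks for, failing before execution, can be sketched in Python (hypothetical names and type list; the real fix lives in Spark's datasource code): validate the user-supplied schema eagerly so an unsupported type raises immediately rather than mid-job.

```python
# Hypothetical whitelist of types a toy JSON source can handle.
SUPPORTED_JSON_TYPES = {"string", "integer", "long", "double", "boolean"}

def validate_schema(schema):
    """Eagerly check a [(field_name, type_name)] schema before any I/O runs."""
    for name, dtype in schema:
        if dtype not in SUPPORTED_JSON_TYPES:
            raise TypeError(
                "JSON source does not support type '%s' for field '%s'" % (dtype, name)
            )

validate_schema([("a", "string")])  # passes silently
try:
    # Analogous to requesting CalendarIntervalType from a JSON source.
    validate_schema([("a", "calendarinterval")])
except TypeError as e:
    print(e)
```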
[SPARK-15143] CSV data source is not being tested as HadoopFsRelation - Spark - [issue]
...JSON, Parquet, Text and ORC are tested as HadoopFsRelation by extending HadoopFsRelationTest, which includes roughly 60 tests. CSV is not tested with it....    Author: Hyukjin Kwon, 2019-05-21, 04:33
[SPARK-14800] Dealing with null as a value in options for each internal data source - Spark - [issue]
...Most options in data sources throw a NullPointerException when the given value is null. For example, the code below: sqlC...    Author: Hyukjin Kwon, 2019-05-21, 04:34
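The defensive handling SPARK-14800 points toward can be sketched in Python (a hypothetical helper, not Spark's option plumbing): reject a null option value up front with a descriptive error, rather than letting it surface later as a NullPointerException deep inside the data source.

```python
def set_option(options, key, value):
    """Toy option setter: fail fast on a null value with a clear message."""
    if value is None:
        # A targeted error here replaces an opaque NPE later.
        raise ValueError("option '%s' must not be null" % key)
    options[key] = str(value)
    return options

opts = set_option({}, "header", True)
print(opts)  # {'header': 'True'}
try:
    set_option(opts, "delimiter", None)
except ValueError as e:
    print(e)  # option 'delimiter' must not be null
```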
[SPARK-18340] Inconsistent error messages in launch scripts and hanging in the sparkr script for wrong options - Spark - [issue]
...It seems there are some problems with handling wrong options, as below: * spark-submit script - this one looks fine: spark-submit --aabbcc  Error: Unrecognized option: --aabbcc  Usage: spark-submit [...    Author: Hyukjin Kwon, 2019-05-21, 04:35