[SPARK-23078] add tests for Spark Thrift Server on Kubernetes #38
base: master
Conversation
Whoops, looks like merging from master broke my function...
Any way to get Jenkins to pull spark from my branch?
@@ -84,11 +87,21 @@
  <artifactId>log4j</artifactId>
  <version>${log4j.version}</version>
</dependency>
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-common</artifactId>
Why do you need to depend on `hadoop-common`?
Without it I get `java.lang.ClassNotFoundException: org.apache.hadoop.conf.Configuration`. In general, I believe Hive JDBC requires some Hadoop dependencies, as discussed e.g. in https://issues.apache.org/jira/browse/HIVE-15110. Not sure if it's only `Configuration`. Do you have a cleaner solution?
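For context, the change under discussion adds a `pom.xml` dependency along these lines; this is a sketch, since the diff above is truncated, and the `${hadoop.version}` property and `test` scope are assumptions rather than taken from the actual patch:

```xml
<!-- Sketch of the hadoop-common dependency being discussed.
     Version property and scope are assumed, not from the diff. -->
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-common</artifactId>
  <version>${hadoop.version}</version>
  <scope>test</scope>
</dependency>
```

Pulling in `hadoop-common` this way makes `org.apache.hadoop.conf.Configuration` available on the test classpath, which is what the `ClassNotFoundException` above indicates is missing.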
Ah, thanks for the info.
@@ -20,6 +20,8 @@ import java.io.File
import java.nio.file.{Path, Paths}
import java.util.UUID
import java.util.regex.Pattern
import java.sql.DriverManager
import org.apache.hive.jdbc.HiveDriver
There should be an empty line separating Java imports from the rest, and the order of imports should be java, scala, other third-party, and finally spark. So `org.apache.hive.jdbc.HiveDriver` should go after `io.fabric8...`.
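Following the Spark import ordering described above, the imports in this file would end up grouped roughly as follows; the `io.fabric8` and `org.apache.spark` lines here are illustrative placeholders, not the suite's actual imports:

```scala
// Group 1: java imports, alphabetized, then a blank line.
import java.io.File
import java.nio.file.{Path, Paths}
import java.sql.DriverManager
import java.util.UUID
import java.util.regex.Pattern

// Group 2: scala imports (illustrative placeholder).
import scala.collection.JavaConverters._

// Group 3: other third-party imports, alphabetized,
// so io.fabric8 sorts before org.apache.hive.
import io.fabric8.kubernetes.api.model.Pod  // placeholder name
import org.apache.hive.jdbc.HiveDriver

// Group 4: spark imports last (illustrative placeholder).
import org.apache.spark.internal.Logging
```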
@@ -239,6 +245,43 @@ private[spark] class KubernetesSuite extends FunSuite with BeforeAndAfterAll wit
  appLocator)
}

private def runThriftServerAndVerifyQuery(
  driverPodChecker: Pod => Unit = doBasicDriverPodCheck,
Indentation for function parameters is wrong. Should be four spaces.
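Per the Spark Scala style convention being referenced, wrapped method parameters are indented four spaces from the `def`; a sketch using the signature from the diff:

```scala
// Parameters on continuation lines get four spaces of indentation,
// distinguishing them from the two-space body indentation.
private def runThriftServerAndVerifyQuery(
    driverPodChecker: Pod => Unit = doBasicDriverPodCheck): Unit = {
  // method body indented two spaces as usual
}
```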
  assert(resultSet.getInt(1) == 42)
} finally {
  statement.close()
  connection.close()
`connection` should be closed regardless of whether `statement.close` throws an exception.
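One common way to guarantee that is nesting the `try`/`finally` blocks, sketched here with the variable names from the diff (the JDBC URL and query are placeholders):

```scala
import java.sql.DriverManager

// Placeholder URL; the real test builds this from the driver pod's address.
val connection = DriverManager.getConnection("jdbc:hive2://localhost:10000")
try {
  val statement = connection.createStatement()
  try {
    val resultSet = statement.executeQuery("SELECT 42")  // placeholder query
    assert(resultSet.next())
    assert(resultSet.getInt(1) == 42)
  } finally {
    statement.close()  // even if this throws...
  }
} finally {
  connection.close()   // ...this finally block still closes the connection
}
```

With a single `finally` containing both `close` calls, an exception from `statement.close()` would skip `connection.close()` and leak the connection; the nested form avoids that.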
Fair enough. I got this from https://github.com/apache/spark/blob/master/sql/hive-thriftserver/src/test/scala/org/apache/spark/sql/hive/thriftserver/JdbcConnectionUriSuite.scala#L66, worth fixing?
@liyinan926 thanks for checking this out, I'm still somewhat new to Scala, let me know if anything else needs to be fixed
@@ -239,6 +245,46 @@ private[spark] class KubernetesSuite extends FunSuite with BeforeAndAfterAll wit
  appLocator)
}

private def runThriftServerAndVerifyQuery(
  driverPodChecker: Pod => Unit = doBasicDriverPodCheck,
These should be indented four spaces.
LGTM with a couple of minor comments. BTW: the added test failed in the last integration test run somehow.
@liyinan926 Thanks. The test succeeds using my fork of the Spark repo, and presumably should be merged together with apache/spark#20272. Is there a way to tell Jenkins to test my fork before this?
I ran it that way successfully on my own machine, just wanted to make it "official" on Jenkins
@ozzieba, looks like the test is failing here. Can you PTAL?
Can one of the admins verify this patch?
Adding tests in response to apache/spark#20272