From 9c530576a44cbeb956db94e7fdd1fad50bd62973 Mon Sep 17 00:00:00 2001
From: Dongjoon Hyun <dongjoon@apache.org>
Date: Wed, 13 Jul 2016 22:24:26 -0700
Subject: [PATCH] [SPARK-16536][SQL][PYSPARK][MINOR] Expose `sql` in PySpark
 Shell

## What changes were proposed in this pull request?

This PR exposes `sql` in the PySpark shell, matching the Scala and R shells, for consistency.

**Background**
* Scala
```scala
scala> sql("select 1 a")
res0: org.apache.spark.sql.DataFrame = [a: int]
```

* R
```r
> sql("select 1")
SparkDataFrame[1:int]
```

**Before**
* Python

```python
>>> sql("select 1 a")
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
NameError: name 'sql' is not defined
```

**After**
* Python

```python
>>> sql("select 1 a")
DataFrame[a: int]
```
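
The change itself is a one-line alias of a bound method. The sketch below illustrates the binding pattern, using a hypothetical `FakeSession` as a stand-in for `pyspark.sql.SparkSession` (a real Spark session is not assumed here); only the `sql = spark.sql` line mirrors the actual change in `shell.py`:

```python
# FakeSession is a hypothetical stand-in for pyspark.sql.SparkSession,
# used only to show the aliasing pattern without a running Spark cluster.
class FakeSession:
    def sql(self, query):
        # A real SparkSession.sql returns a DataFrame; here we return a
        # string shaped like the shell's DataFrame repr, for illustration.
        return "DataFrame[a: int]"

spark = FakeSession()

# The one-line change this patch makes: binding the method to a top-level
# name, so shell users can call sql(...) without the `spark.` prefix.
sql = spark.sql

print(sql("select 1 a"))  # DataFrame[a: int]
```

Because `spark.sql` is a bound method, the alias keeps its reference to the session; calling `sql(...)` is exactly equivalent to calling `spark.sql(...)`.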

## How was this patch tested?

Manual testing in the PySpark shell.

Author: Dongjoon Hyun <dongjoon@apache.org>

Closes #14190 from dongjoon-hyun/SPARK-16536.
---
 python/pyspark/shell.py | 1 +
 1 file changed, 1 insertion(+)

diff --git a/python/pyspark/shell.py b/python/pyspark/shell.py
index ac5ce87a3f..c1917d2be6 100644
--- a/python/pyspark/shell.py
+++ b/python/pyspark/shell.py
@@ -49,6 +49,7 @@ except TypeError:
     spark = SparkSession.builder.getOrCreate()
 
 sc = spark.sparkContext
+sql = spark.sql
 atexit.register(lambda: sc.stop())
 
 # for compatibility
-- 
GitLab