We’re happy to announce the release of the Spark Interactive Console in Azure Toolkit for IntelliJ. This new component facilitates Spark job authoring and allows you to run code interactively in a shell-like environment inside IntelliJ.
The Spark console consists of the Spark local console and the Spark Livy interactive session. When you run the Spark console, instances of SparkSession and SparkContext are automatically instantiated, as in the Spark shell: use ‘spark’ to access the SparkSession and ‘sc’ to access the SparkContext. The Spark local console lets you run your code interactively and validate your code logic locally. You can also inspect your program variables and perform other scripting operations locally before submitting to the cluster. The Spark Livy interactive session establishes an interactive communication channel with your cluster, so you can check file schemas, preview data, and run ad-hoc queries while you are developing your Spark job. You can also easily switch the Livy interactive session between different Spark clusters.
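For example, a session in the console might look like the following sketch. Because ‘spark’ and ‘sc’ are pre-instantiated, no setup code is needed; the file path, view name, and query are illustrative, not part of the toolkit:

```scala
// `spark` (SparkSession) and `sc` (SparkContext) are already available.
// Check a file's schema and preview its data (example path):
val df = spark.read.json("wasbs:///example/data/people.json")
df.printSchema()   // inspect the inferred schema
df.show(5)         // preview a few rows

// Run an ad-hoc query against the data:
df.createOrReplaceTempView("people")
spark.sql("SELECT name FROM people WHERE age > 30").show()

// The SparkContext is also at hand for RDD operations:
sc.parallelize(1 to 10).sum()
```

In the local console this runs against your machine; in a Livy interactive session the same statements execute on the connected cluster.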
The Spark console has an integrated language service for Scala programming. You can leverage language service features, such as IntelliSense and autocomplete, to look up the properties of Spark objects (e.g., the Spark context and Spark session), query Hive metadata, and check function signatures.
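Querying Hive metadata from the console can be as simple as the sketch below; the database name is an assumption and depends on your cluster:

```scala
// List the databases and tables the connected cluster exposes
// through the Hive metastore (illustrative; output varies by cluster):
spark.catalog.listDatabases().show()
spark.catalog.listTables("default").show()
```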
A new feature, Send Selection to Spark Console (Ctrl + Shift + S), has been added to simplify access to the Spark console. You can send a highlighted single line of code or a block of code to the console from your main Scala project. This feature lets you switch smoothly between contexts: coding in the editor and validating or testing code in the Spark console.
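As a sketch of the workflow, you might highlight a block like the one below in your main Scala project and press Ctrl + Shift + S to evaluate it in the console (the names and data are illustrative):

```scala
// Highlight this block in the editor and send it to the Spark console
// to check the logic before wiring it into the full job:
val nums = sc.parallelize(Seq(1, 2, 3, 4, 5))
val squares = nums.map(n => n * n)
println(squares.collect().mkString(", "))
```

This lets you verify a transformation interactively, then continue editing the surrounding job code without leaving IntelliJ.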
Summary of new features
- Run the Spark local console
- Run the Spark Livy interactive session console
- Language service for Scala enabled in the console
- Send selected code to the console
The addition of the Spark console is an important step forward for the Azure Toolkit, expanding its capabilities beyond batch job processing. This update also supports interactive querying across local and dev/test clusters.
To run your code, press Ctrl + Enter; use the up and down arrow keys to browse the history of previously run code.
How to access
You can easily start the Spark console either from the Tools menu or from a Scala file via the right-click context menu.
For more information, check out the following:
We look forward to your comments and feedback. If you have any feature requests, customer asks, or suggestions, please send us a note at email@example.com. For bug submissions, please open a new ticket using the template.