Learn how to execute and monitor Shell scripts efficiently with sudo support, working directories, and full logging control.
In this article, we’ll dive deep into how DolphinScheduler uses ProcessBuilder to execute shell commands. By default, DolphinScheduler wraps shell scripts using BashShellInterceptorBuilder, generating execution commands that support both standard and sudo modes.
We’ll also walk through a Spring Boot example showing how to configure working directories, merge output streams, monitor process execution, and output logs — helping you achieve efficient Shell task management and scheduling.
1. How DolphinScheduler Uses ProcessBuilder
1.1. Wrapping Shell Commands
In DolphinScheduler, the command wrapping happens inside:
org.apache.dolphinscheduler.plugin.task.api.shell.ShellInterceptorBuilderFactory
public class ShellInterceptorBuilderFactory {

    private static final String INTERCEPTOR_TYPE = PropertyUtils.getString("shell.interceptor.type", "bash");

    @SuppressWarnings("unchecked")
    public static IShellInterceptorBuilder newBuilder() {
        // Default logic
        if (INTERCEPTOR_TYPE.equalsIgnoreCase("bash")) {
            return new BashShellInterceptorBuilder();
        }
        if (INTERCEPTOR_TYPE.equalsIgnoreCase("sh")) {
            return new ShShellInterceptorBuilder();
        }
        if (INTERCEPTOR_TYPE.equalsIgnoreCase("cmd")) {
            return new CmdShellInterceptorBuilder();
        }
        throw new IllegalArgumentException("Unsupported shell type: " + INTERCEPTOR_TYPE);
    }
}
✅ By default, BashShellInterceptorBuilder is used.
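Since the type is resolved from the shell.interceptor.type property (default bash), switching to sh or cmd is just a configuration change. For example, assuming the key is picked up from DolphinScheduler's common.properties (the file PropertyUtils typically reads):

# Assumption: PropertyUtils resolves this key from common.properties
# Allowed values: bash (default), sh, cmd
shell.interceptor.type=sh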
Here’s the relevant part of BashShellInterceptorBuilder:
org.apache.dolphinscheduler.plugin.task.api.shell.bash.BashShellInterceptorBuilder
public class BashShellInterceptorBuilder
        extends BaseLinuxShellInterceptorBuilder<BashShellInterceptorBuilder, BashShellInterceptor> {

    @Override
    public BashShellInterceptorBuilder newBuilder() {
        return new BashShellInterceptorBuilder();
    }

    @Override
    public BashShellInterceptor build() throws FileOperateException, IOException {
        // Core part: generate the shell script
        generateShellScript();
        List<String> bootstrapCommand = generateBootstrapCommand();
        // Instantiate BashShellInterceptor
        return new BashShellInterceptor(bootstrapCommand, shellDirectory);
    }

    @Override
    protected String shellInterpreter() {
        return "bash";
    }

    @Override
    protected String shellExtension() {
        return ".sh";
    }

    @Override
    protected String shellHeader() {
        return "#!/bin/bash";
    }
}
Command generation happens here:
org.apache.dolphinscheduler.plugin.task.api.shell.BaseLinuxShellInterceptorBuilder#generateBootstrapCommand
protected List<String> generateBootstrapCommand() {
    if (sudoEnable) {
        // sudo -u tenant -i /opt/xx.sh
        return bootstrapCommandInSudoMode();
    }
    // bash /opt/xx.sh
    return bootstrapCommandInNormalMode();
}
Two paths:
- Normal mode: bash /path/to/script.sh
- Sudo mode: sudo -u tenant -i /path/to/script.sh
Example of sudo mode:
private List<String> bootstrapCommandInSudoMode() {
    List<String> bootstrapCommand = new ArrayList<>();
    bootstrapCommand.add("sudo");
    if (StringUtils.isNotBlank(runUser)) {
        bootstrapCommand.add("-u");
        bootstrapCommand.add(runUser);
    }
    bootstrapCommand.add("-i");
    bootstrapCommand.add(shellAbsolutePath().toString());
    return bootstrapCommand;
}
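For comparison, here is a minimal sketch of how the normal-mode branch could look, reusing shellInterpreter() and shellAbsolutePath() from the builder shown above; the actual bootstrapCommandInNormalMode in BaseLinuxShellInterceptorBuilder may differ in detail:

// Sketch only: prefix the generated script with the interpreter, e.g. "bash /opt/xx.sh"
private List<String> bootstrapCommandInNormalMode() {
    List<String> bootstrapCommand = new ArrayList<>();
    bootstrapCommand.add(shellInterpreter());                // "bash" for this builder
    bootstrapCommand.add(shellAbsolutePath().toString());    // absolute path of the generated script
    return bootstrapCommand;
}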
1.2. Executing the Shell Command
The real execution is handled here:
org.apache.dolphinscheduler.plugin.task.api.shell.BaseShellInterceptor
public abstract class BaseShellInterceptor implements IShellInterceptor {

    protected final String workingDirectory;
    protected final List<String> executeCommands;

    protected BaseShellInterceptor(List<String> executeCommands, String workingDirectory) {
        this.executeCommands = executeCommands;
        this.workingDirectory = workingDirectory;
    }

    @Override
    public Process execute() throws IOException {
        ProcessBuilder processBuilder = new ProcessBuilder();
        processBuilder.directory(new File(workingDirectory)); // Set working directory (important for loading JARs etc.)
        processBuilder.redirectErrorStream(true);             // Merge error stream into output stream
        processBuilder.command(executeCommands);
        log.info("Executing shell command: {}", String.join(" ", executeCommands));
        return processBuilder.start();
    }
}
2. Practical Example: Shell Scheduling Made Easy
Let’s build a simple Spring Boot application that launches a Shell script!
2.1. pom.xml Dependency
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter</artifactId>
    <version>2.6.1</version>
</dependency>
2.2. Spring Boot Application Code
import java.io.BufferedReader;
import java.io.File;
import java.io.InputStreamReader;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.TimeUnit;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class Application {

    public static void main(String[] args) throws Exception {
        SpringApplication.run(Application.class, args);

        // Build the same sudo-mode command that DolphinScheduler generates
        List<String> executeCommands = new ArrayList<>();
        executeCommands.add("sudo");
        executeCommands.add("-u");
        executeCommands.add("qiaozhanwei");
        executeCommands.add("-i");
        executeCommands.add("/opt/test/my.sh");

        ProcessBuilder processBuilder = new ProcessBuilder();
        processBuilder.directory(new File("/opt/test"));   // Set working directory
        processBuilder.redirectErrorStream(true);          // Merge error output into standard output
        processBuilder.command(executeCommands);
        Process process = processBuilder.start();

        // Read the merged output line by line and print it
        try (BufferedReader inReader = new BufferedReader(new InputStreamReader(process.getInputStream()))) {
            String line;
            while ((line = inReader.readLine()) != null) {
                System.out.println(line);
            }
        } catch (Exception e) {
            e.printStackTrace();
        }

        // Wait for at most 10 minutes; status is true if the process exited in time
        boolean status = process.waitFor(10, TimeUnit.MINUTES);
        System.out.println("Execution status -> " + status);
    }
}
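Note that waitFor(10, TimeUnit.MINUTES) only reports whether the process exited within the timeout; it does not stop a script that is still running. (In this demo the read loop on the main thread already blocks until the script closes its output, so in a real scheduler you would read the stream on a separate thread.) If you want to actually enforce the timeout, a small addition along these lines, using only the standard Process API, will do it:

// Enforce the timeout: if the script is still running after the wait, kill it
boolean finished = process.waitFor(10, TimeUnit.MINUTES);
if (!finished) {
    process.destroyForcibly();   // forcibly terminate the process (SIGKILL on Linux)
    process.waitFor();           // wait for it to actually exit
    System.out.println("Timed out, process was terminated");
} else {
    System.out.println("Exit code -> " + process.exitValue());
}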
2.3. Example Log Output
The logs show:
- The Shell script was executed successfully
- Standard and error outputs were captured
- The process finished within 10 minutes
Snippet:
Executing shell command : sudo -u qiaozhanwei -i /opt/test/my.sh
...
Job job_1694766249884_0931 completed successfully
status ->true
Process finished with exit code 0
Final Thoughts
Using DolphinScheduler + ProcessBuilder, you can flexibly handle Shell task execution with full control over:
- Working directories
- Error/standard output streams
- Sudo permissions
- Timeout monitoring
This approach is simple, robust, and easy to extend — a must-have skill if you’re managing Shell-based workflows in big data or DevOps scenarios!