Discussion:
(Strange!) getFileSystem in JVM shutdown hook throws shutdown-in-progress exception
Ted Yu
2010-03-10 05:47:22 UTC
Permalink
By the time run() gets executed, main() has already returned and JVM shutdown has begun.
Can you perform that action in main() instead? Right now main() does nothing but register the hook.
Silllllence
2010-03-10 05:39:39 UTC
Permalink
Hi fellows,
The code segment below adds a shutdown hook to the JVM, but I got a strange exception:
java.lang.IllegalStateException: Shutdown in progress
at java.lang.ApplicationShutdownHooks.add(ApplicationShutdownHooks.java:39)
at java.lang.Runtime.addShutdownHook(Runtime.java:192)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1387)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:191)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:95)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:180)
at org.apache.hadoop.fs.Path.getFileSystem(Path.java:175)
at young.Main$1.run(Main.java:21)
The Javadoc (http://java.sun.com/j2se/1.5.0/docs/api/) says this exception is thrown when the virtual machine is already in the process of shutting down. What does this mean? Why does it happen? How can I fix it?
I'd really appreciate it if you could try this code and help me figure out what's going on here. Thank you!
---------------------------------------------------------------------------------------
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapred.JobConf;

@SuppressWarnings("deprecation")
public class Main {

    public static void main(String[] args) {
        Runtime.getRuntime().addShutdownHook(new Thread() {
            @Override
            public void run() {
                Path path = new Path("/temp/hadoop-young");
                System.out.println("Thread run : " + path);
                Configuration conf = new JobConf();
                FileSystem fs;
                try {
                    // Fails here: getFileSystem() lazily registers Hadoop's own
                    // shutdown hook, but the JVM is already shutting down.
                    fs = path.getFileSystem(conf);
                    if (fs.exists(path)) {
                        fs.delete(path);
                    }
                } catch (Exception e) {
                    System.err.println(e.getMessage());
                    e.printStackTrace();
                }
            }
        });
    }
}
--
View this message in context: http://old.nabble.com/%28Strange%21%29getFileSystem-in-JVM-shutdown-hook-throws-shutdown-in-progress-exception-tp27845803p27845803.html
Sent from the Hadoop core-user mailing list archive at Nabble.com.
Todd Lipcon
2010-03-10 18:00:11 UTC
Permalink
Hi,

The issue here is that Hadoop itself uses a shutdown hook to close all open
filesystems when the JVM shuts down. Since JVM shutdown hooks don't have a
specified order, you shouldn't access Hadoop filesystem objects from a
shutdown hook.
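The failure can be reproduced without Hadoop at all. Registering a second hook while the hooks are already running — which is exactly what FileSystem$Cache.get attempts lazily on first use — is rejected by the JVM. A minimal sketch:

```java
// Minimal reproduction of the "Shutdown in progress" failure, no Hadoop needed.
public class HookDuringShutdown {
    public static void main(String[] args) {
        Runtime.getRuntime().addShutdownHook(new Thread() {
            @Override
            public void run() {
                try {
                    // Stands in for the hidden addShutdownHook call made inside
                    // path.getFileSystem(conf): hooks are already running, so
                    // the JVM refuses to accept a new one.
                    Runtime.getRuntime().addShutdownHook(new Thread());
                } catch (IllegalStateException e) {
                    System.out.println(e.getMessage());
                }
            }
        });
        // main() returns immediately; the hook runs during JVM shutdown
        // and prints "Shutdown in progress".
    }
}
```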

To get around this you can use the fs.automatic.close configuration variable
(provided by this patch: https://issues.apache.org/jira/browse/HADOOP-4829 )
to disable the Hadoop shutdown hook. The patch is applied in CDH2; on other
builds you'll have to apply it manually.

Note that if you disable the shutdown hook, you'll need to close the
filesystems yourself by calling FileSystem.closeAll()
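For reference, a minimal sketch of that configuration (assuming a build that includes the HADOOP-4829 patch, where this property is defined):

```xml
<!-- core-site.xml: disable Hadoop's automatic FileSystem shutdown hook.
     Only effective on builds that include HADOOP-4829. -->
<property>
  <name>fs.automatic.close</name>
  <value>false</value>
</property>
```

With this set, call FileSystem.closeAll() yourself, e.g. at the end of your own shutdown hook, so cached filesystems still get closed.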

Thanks
-Todd
--
Todd Lipcon
Software Engineer, Cloudera
Ted Yu
2010-03-10 20:57:07 UTC
Permalink
We saw a similar issue because we have our own shutdown hook.

Here is the stack trace (hadoop 0.20.1):

TERM trapped. Shutting down.
Exception in thread "Thread-6" java.lang.NullPointerException
at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.closeThreads(DFSClient.java:3164)
at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.closeInternal(DFSClient.java:3207)
at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.close(DFSClient.java:3152)
at org.apache.hadoop.hdfs.DFSClient$LeaseChecker.close(DFSClient.java:1032)
at org.apache.hadoop.hdfs.DFSClient.close(DFSClient.java:233)
at org.apache.hadoop.hdfs.DistributedFileSystem.close(DistributedFileSystem.java:269)
at org.apache.hadoop.fs.FileSystem$Cache.closeAll(FileSystem.java:1419)
at org.apache.hadoop.fs.FileSystem.closeAll(FileSystem.java:212)
at org.apache.hadoop.fs.FileSystem$ClientFinalizer.run(FileSystem.java:197)
<-- Wrapper Stopped

Hopefully there will be no NPE in 0.21.0.
Ted Yu
2010-03-19 18:32:28 UTC
Permalink
I have logged a comment on https://issues.apache.org/jira/browse/HADOOP-4829, related to the IllegalStateException I saw when Cache.remove() tried to remove the shutdown hook while the JVM was already shutting down.

Cheers