Getting URL content from HTTPS sites

Hi,

I have the following code, which works well when I try to get the content of non-HTTPS sites like www.yahoo.com or www.google.com. But when I try to get the content from HTTPS sites, it throws an exception; it seems it is not able to fetch content over HTTPS. Is there any way to do this?
import java.net.*;
import java.io.*;

public class jget
{
  public static void main ( String[] args )
  {
    try
    {
        // Works for plain HTTP; fails for HTTPS without SSL support configured
        URL url = new URL("http://www.yahoo.com");

        BufferedReader in = new BufferedReader(new InputStreamReader(url.openStream()));
        String str;

        // Print the page line by line
        while ((str = in.readLine()) != null)
        {
          System.out.println(str);
        }

        in.close();
    }
    catch (MalformedURLException e)
    {
        e.printStackTrace();  // don't swallow exceptions silently
    }
    catch (IOException e)
    {
        e.printStackTrace();
    }
  }
}


Asked by: jaxrpc
1 Solution
 
Dejan Pažin, Head of SW Development, commented:


Add this to your code (you will also need to import java.security.Security):

System.setProperty("java.protocol.handler.pkgs", "com.sun.net.ssl.internal.www.protocol");
Security.addProvider(new com.sun.net.ssl.internal.ssl.Provider());

Here is an explanation:
http://www.javaworld.com/javaworld/javatips/jw-javatip96.html
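Note that the handler property and provider registration above are only needed on very old JVMs. On Java 1.4 and later, JSSE is bundled and registered by default, so an https: URL can be opened directly with the same stream code from the question. A minimal sketch (the host name is just an example):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.URL;

public class JGetHttps
{
  public static void main(String[] args)
  {
    try
    {
      // On Java 1.4+ the built-in JSSE provider handles https URLs directly,
      // with no extra system properties or Security.addProvider() calls.
      URL url = new URL("https://www.google.com");

      BufferedReader in = new BufferedReader(new InputStreamReader(url.openStream()));
      String str;

      // Print the page line by line, exactly as in the HTTP version
      while ((str = in.readLine()) != null)
      {
        System.out.println(str);
      }

      in.close();
    }
    catch (IOException e)
    {
      e.printStackTrace();
    }
  }
}
```

The server's certificate must still be trusted by the JVM's default truststore; for self-signed certificates you would additionally have to import the certificate or configure a custom TrustManager.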
Question has a verified solution.