Connection Streams in Java

This chapter describes how the Oracle Java Database Connectivity (JDBC) drivers handle Java streams for several data types. Data streams enable you to read LONG column data of up to 2 gigabytes (GB).

This chapter covers the following topics:

12.1 Overview of Java Streams

Oracle JDBC drivers support the manipulation of data streams in either direction between server and client. The drivers support all stream conversions: binary, ASCII, and Unicode. Following is a brief description of each type of stream:

  • Binary: Used for RAW bytes of data; corresponds to the getBinaryStream method
  • ASCII: Used for ASCII bytes in ISO-Latin-1 encoding; corresponds to the getAsciiStream method
  • Unicode: Used for Unicode bytes with the UTF-16 encoding; corresponds to the getUnicodeStream method

The getBinaryStream, getAsciiStream, and getUnicodeStream methods return the bytes of data in an InputStream object.
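For illustration, the following is a minimal sketch of using one of these getters; the ResultSet variable rset and the column index 1 are assumptions, and only one of the three getters would normally be called for a given column. Note that getUnicodeStream is deprecated in the standard JDBC API.

// Sketch only: "rset" and the column index 1 are assumptions for illustration.
InputStream asciiData = rset.getAsciiStream(1);   // or getBinaryStream(1) / getUnicodeStream(1)
int b;
while ((b = asciiData.read()) != -1) {
    // process the bytes as they arrive from the network
}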

Starting from Oracle Database 12c Release 1 (12.1), the CONNECTION_PROPERTY_STREAM_CHUNK_SIZE connection property is deprecated and the driver does not use it internally for setting the stream chunk size.

12.2 About Streaming LONG or LONG RAW Columns

This section covers the following topics:


12.2.1 Overview of Streaming LONG or LONG RAW Columns

When a query selects one or more LONG or LONG RAW columns, the JDBC driver transfers these columns to the client in streaming mode. In streaming mode, the JDBC driver does not read the column data from the network for LONG or LONG RAW columns until it is required. The column data remains in the network communications channel until your code calls a getXXX method to read it. Even after the call, the column data is read only as needed to populate the return value of the getXXX call. Because the column data remains in the communications channel, streaming mode interferes with all other use of the connection: any use of the connection, other than reading the column data, discards that data from the channel. So, while streaming mode makes efficient use of memory and minimizes network round-trips, it interferes with many other database operations.
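The following hedged sketch illustrates this constraint; the statement, table, and column names are assumptions. The stream is consumed completely before the connection is used for anything else, because any other operation on the connection would discard the unread column data.

// Sketch: the table and column names are illustrative only.
ResultSet rs = stmt.executeQuery("select long_col from some_table");
while (rs.next()) {
    InputStream data = rs.getAsciiStream(1);
    int c;
    while ((c = data.read()) != -1) {
        // consume the LONG data here, before any other use of the connection
    }
    data.close();
}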

Oracle recommends avoiding LONG and LONG RAW columns. Use LOB instead.

To access the data in a LONG column, you can get the column as a Java InputStream object and use the read method of the InputStream object. As an alternative, you can get the data as a String or byte array. In this case, the driver will do the streaming for you.

You can get LONG and LONG RAW data with any of the three stream types. The driver performs conversions for you, depending on the character set of the database and the driver.

Do not create tables with LONG columns. Use large object (LOB) columns (CLOB, NCLOB, and BLOB) instead. LONG columns are supported only for backward compatibility. Oracle recommends that you convert existing LONG columns to LOB columns. LOB columns are subject to far fewer restrictions than LONG columns.
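Oracle supports converting an existing LONG column to a CLOB (and a LONG RAW column to a BLOB) in place with ALTER TABLE ... MODIFY. The following is a minimal sketch of issuing such a statement through JDBC; the connection variable and the table and column names are assumptions.

// Sketch: "conn" and the table/column names are assumptions.
try (Statement ddl = conn.createStatement()) {
    // Converts the LONG column "doc_text" of table "doc_table" to a CLOB.
    ddl.execute("alter table doc_table modify (doc_text clob)");
}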

12.2.2 LONG RAW Data Conversions

A call to getBinaryStream returns RAW data. A call to getAsciiStream converts the RAW data to hexadecimal and returns the ASCII representation. A call to getUnicodeStream converts the RAW data to hexadecimal and returns the Unicode characters.
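For example (a sketch, with assumed ResultSet variables and column index): if the LONG RAW column stores the single byte 0xA1, getBinaryStream delivers that byte unchanged, whereas getAsciiStream delivers the ASCII characters of its hexadecimal digits.

// Sketch: "rawRset" and "hexRset" come from two separate, assumed queries of the
// same LONG RAW column, because a streamed column can be read only once per row.
InputStream raw = rawRset.getBinaryStream(1);  // yields the stored byte 0xA1 unchanged
InputStream hex = hexRset.getAsciiStream(1);   // yields the hexadecimal digit characters instead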

12.2.3 LONG Data Conversions

When you get LONG data with getAsciiStream, the drivers assume that the underlying data in the database uses a US7ASCII or WE8ISO8859P1 character set. If the assumption is true, then the drivers return bytes corresponding to ASCII characters. If the database is not using a US7ASCII or WE8ISO8859P1 character set, a call to getAsciiStream returns meaningless information.

When you get LONG data with getUnicodeStream , you get a stream of Unicode characters in the UTF-16 encoding. This applies to all underlying database character sets that Oracle supports.

When you get LONG data with getBinaryStream , there are two possible cases:

  • If the driver is JDBC OCI and the client character set is not US7ASCII or WE8ISO8859P1, then a call to getBinaryStream returns UTF-8 bytes. If the client character set is US7ASCII or WE8ISO8859P1, then the call returns a US7ASCII stream of bytes.
  • If the driver is JDBC Thin and the database character set is not US7ASCII or WE8ISO8859P1, then a call to getBinaryStream returns UTF-8 bytes. If the server-side character set is US7ASCII or WE8ISO8859P1, then the call returns a US7ASCII stream of bytes.

Receiving LONG or LONG RAW columns as a stream requires you to pay special attention to the order in which you retrieve columns from the database.

The following table summarizes LONG and LONG RAW data conversions for each stream type.

Table 12-1 LONG and LONG RAW Data Conversions

LONG
  • getBinaryStream: Bytes representing characters in Unicode UTF-8. The bytes can represent characters in US7ASCII or WE8ISO8859P1 if the database character set is US7ASCII or WE8ISO8859P1.
  • getAsciiStream: Bytes representing characters in ISO-Latin-1 (WE8ISO8859P1) encoding
  • getUnicodeStream: Bytes representing characters in Unicode UTF-16 encoding

LONG RAW
  • getBinaryStream: The RAW bytes, unchanged
  • getAsciiStream: ASCII representation of the hexadecimal bytes
  • getUnicodeStream: Unicode representation of the hexadecimal bytes


12.2.4 Examples: Streaming LONG RAW Data

One of the features of a getXXXStream method is that it enables you to fetch data incrementally. In contrast, getBytes fetches all the data in one call. This section contains two examples of getting a stream of binary data. The first version uses the getBinaryStream method to obtain LONG RAW data, and the second version uses the getBytes method.

Getting a LONG RAW Data Column with getBinaryStream

This example writes the contents of a LONG RAW column to a file on the local file system. In this case, the driver fetches the data incrementally.

The following code creates the table that stores a column of LONG RAW data associated with the name LESLIE:

-- SQL code:
create table streamexample (NAME varchar2(256), GIFDATA long raw);
insert into streamexample values ('LESLIE', '00010203040506070809');

The following Java code snippet writes the data from the LONG RAW column into a file called leslie.gif:

ResultSet rset = stmt.executeQuery
    ("select GIFDATA from streamexample where NAME='LESLIE'");
// get first row
if (rset.next()) {
    // Get the GIF data as a stream from Oracle to the client
    InputStream gif_data = rset.getBinaryStream(1);
    FileOutputStream file = null;
    try {
        file = new FileOutputStream("leslie.gif");
        int chunk;
        while ((chunk = gif_data.read()) != -1)
            file.write(chunk);
    } catch (Exception e) {
        String err = e.toString();
        System.out.println(err);
    } finally {
        if (file != null)
            file.close();
    }
}

In this example, the InputStream object returned by the call to getBinaryStream reads the data directly from the database connection.

Getting a LONG RAW Data Column with getBytes

This example gets the content of the GIFDATA column with getBytes instead of getBinaryStream . In this case, the driver fetches all the data in one call and stores it in a byte array. The code snippet is as follows:

ResultSet rset2 = stmt.executeQuery
    ("select GIFDATA from streamexample where NAME='LESLIE'");
// get first row
if (rset2.next()) {
    // Get the GIF data as a byte array, fetched in a single call
    byte[] bytes = rset2.getBytes(1);
    FileOutputStream file = null;
    try {
        file = new FileOutputStream("leslie2.gif");
        file.write(bytes);
    } catch (Exception e) {
        String err = e.toString();
        System.out.println(err);
    } finally {
        if (file != null)
            file.close();
    }
}

Because a LONG RAW column can contain up to 2 gigabytes of data, the getBytes example can use much more memory than the getBinaryStream example. Use streams if you do not know the maximum size of the data in your LONG or LONG RAW columns.

12.2.5 About Avoiding Streaming for LONG or LONG RAW

Starting from Oracle Database 12c Release 1 (12.1), the technique described in this section is deprecated.

The JDBC driver automatically streams any LONG and LONG RAW columns. However, there may be situations where you want to avoid data streaming. For example, if you have a very small LONG column, then you may want to avoid returning the data incrementally and, instead, return the data in one call.

To avoid streaming, use the defineColumnType method to redefine the type of the LONG column. For example, if you redefine the LONG or LONG RAW column as VARCHAR or VARBINARY type, then the driver will not automatically stream the data.

If you redefine column types with defineColumnType , then you must declare the types of the columns in the query. If you do not declare the types of the columns, then executeQuery will fail. In addition, you must cast the Statement object to oracle.jdbc.OracleStatement .

As an added benefit, using defineColumnType saves the OCI driver a database round-trip when running the query. Without defineColumnType, the OCI driver must make an extra round trip to request the data types of the columns. The JDBC Thin driver derives no benefit from defineColumnType, because it always uses the minimum number of round-trips.

Using the example from the previous section, the Statement object stmt is cast to OracleStatement and the column containing LONG RAW data is redefined to be of the type VARBINARY. The data is not streamed. Instead, it is returned in a byte array. The code snippet is as follows:

// Cast the statement stmt to an OracleStatement
oracle.jdbc.OracleStatement ostmt = (oracle.jdbc.OracleStatement)stmt;
// Redefine the LONG RAW column at index position 1 as VARBINARY
ostmt.defineColumnType(1, Types.VARBINARY);
// Do a query to get the image named 'LESLIE'
ResultSet rset = ostmt.executeQuery
    ("select GIFDATA from streamexample where NAME='LESLIE'");
// The data is not streamed here
rset.next();
byte[] bytes = rset.getBytes(1);



Java — IO — Connection (Stream and Channel)

Java Conceptual Diagram

In order to perform I/O operations (for example reading or writing), you need to open a connection.

In Java, this connection is modelled through streams (the java.io package) and channels (the java.nio package).

They are classes (and methods) that open a connection to an entity, such as a file or a socket, that is capable of performing one or more distinct I/O operations, for example reading or writing.

A stream or channel can be seen as a sequence of bytes on which read and write operations are performed. You close (release) the entity (channel or connection) when a condition is met (for example, end of file, or a given end address).

I/O Method

Stream Operations

Chained: Streams of the same direction (input or output) can be chained one to another by passing one stream to the constructor of a second stream.
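A minimal sketch of chaining, where the file name is a placeholder and the file is assumed to exist and contain at least one int:

import java.io.BufferedInputStream;
import java.io.DataInputStream;
import java.io.FileInputStream;
import java.io.IOException;

public class ChainExample {
    public static void main(String[] args) throws IOException {
        // Each stream is passed to the constructor of the next one:
        // DataInputStream -> BufferedInputStream -> FileInputStream
        try (DataInputStream in = new DataInputStream(
                 new BufferedInputStream(
                     new FileInputStream("data.bin")))) {   // placeholder file name
            int value = in.readInt();                        // reads 4 bytes, big-endian
            System.out.println("First int in file: " + value);
        }
    }
}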

Concatenated: Streams can be concatenated via a SequenceInputStream (a SequenceInputStream represents the logical concatenation of other input streams).
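A sketch of concatenation; the two file names are placeholders:

import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.SequenceInputStream;

public class ConcatExample {
    public static void main(String[] args) throws IOException {
        // Reads part1.txt followed by part2.txt as if they were a single stream.
        try (InputStream in = new SequenceInputStream(
                 new FileInputStream("part1.txt"),
                 new FileInputStream("part2.txt"))) {
            int c;
            while ((c = in.read()) != -1) {
                System.out.print((char) c);
            }
        }
    }
}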

Piped: Data is read from a PipedInputStream (or PipedReader) by one thread, and data is written to the corresponding PipedOutputStream (or PipedWriter) by some other thread.
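A small sketch of a pipe: a writer thread produces bytes, and the main thread reads them from the connected input end.

import java.io.IOException;
import java.io.PipedInputStream;
import java.io.PipedOutputStream;

public class PipeExample {
    public static void main(String[] args) throws Exception {
        PipedOutputStream out = new PipedOutputStream();
        PipedInputStream in = new PipedInputStream(out);  // connect the two ends

        Thread writer = new Thread(() -> {
            try {
                out.write("hello from the writer thread".getBytes());
                out.close();   // closing the output end signals end-of-stream to the reader
            } catch (IOException e) {
                e.printStackTrace();
            }
        });
        writer.start();

        // The reading thread (here: main) blocks until data arrives in the pipe.
        int c;
        while ((c = in.read()) != -1) {
            System.out.print((char) c);
        }
        in.close();
        writer.join();
    }
}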

Architecture

All other stream types are built on the byte streams FileInputStream and FileOutputStream, which read and write a file one byte at a time. See Java — IO — Byte Stream.

Byte

Data Type

Characters

Java Primitive

DataInputStream: A data input stream lets an application read primitive Java data types from an underlying input stream in a machine-independent way.

DataOutputStream: A data output stream lets an application write primitive Java data types to an output stream in a portable way.
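A sketch of writing and reading back a few primitives; the file name is a placeholder, and the values must be read back in the same order they were written.

import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;

public class DataStreamExample {
    public static void main(String[] args) throws IOException {
        // Write a few primitives in a machine-independent format...
        try (DataOutputStream out = new DataOutputStream(new FileOutputStream("prims.dat"))) {
            out.writeInt(42);
            out.writeDouble(3.14);
            out.writeUTF("hello");
        }
        // ...and read them back in the same order.
        try (DataInputStream in = new DataInputStream(new FileInputStream("prims.dat"))) {
            int i = in.readInt();
            double d = in.readDouble();
            String s = in.readUTF();
            System.out.println(i + " " + d + " " + s);
        }
    }
}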

Java Object

ObjectInputStream: An ObjectInputStream deserializes primitive data and objects previously written using an ObjectOutputStream.

ObjectOutputStream: An ObjectOutputStream writes primitive data types and graphs of Java objects to an OutputStream.
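A sketch of object serialization; the Point class and the file name are made up for the example, and the serialized type must implement Serializable.

import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.util.ArrayList;

public class ObjectStreamExample {
    // The type being serialized must implement Serializable.
    static class Point implements Serializable {
        int x, y;
        Point(int x, int y) { this.x = x; this.y = y; }
    }

    public static void main(String[] args) throws Exception {
        ArrayList<Point> points = new ArrayList<>();
        points.add(new Point(1, 2));
        points.add(new Point(3, 4));

        try (ObjectOutputStream out = new ObjectOutputStream(new FileOutputStream("points.ser"))) {
            out.writeObject(points);           // writes the whole object graph
        }
        try (ObjectInputStream in = new ObjectInputStream(new FileInputStream("points.ser"))) {
            @SuppressWarnings("unchecked")
            ArrayList<Point> copy = (ArrayList<Point>) in.readObject();
            System.out.println(copy.size() + " points restored");
        }
    }
}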

Filter

Filtered streams apply a transformation to, or add functionality on top of, other streams.

A FilterInputStream or FilterOutputStream uses another input or output stream as its source or target of data.

The superclasses (abstract classes) are FilterInputStream and FilterOutputStream.
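As an illustration, here is a hedged sketch of a custom filter stream that counts the bytes it passes through; only the single-byte read() is overridden for brevity, so the count is accurate only for callers that use that method.

import java.io.ByteArrayInputStream;
import java.io.FilterInputStream;
import java.io.IOException;
import java.io.InputStream;

public class CountingInputStream extends FilterInputStream {
    private long count;

    public CountingInputStream(InputStream in) {
        super(in);                      // the wrapped stream is the source of data
    }

    @Override
    public int read() throws IOException {
        int b = super.read();
        if (b != -1) {
            count++;
        }
        return b;
    }

    public long getCount() {
        return count;
    }

    public static void main(String[] args) throws IOException {
        try (CountingInputStream in =
                 new CountingInputStream(new ByteArrayInputStream("filtered".getBytes()))) {
            while (in.read() != -1) {
                // drain the stream
            }
            System.out.println("bytes read: " + in.getCount());
        }
    }
}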

Standard Stream

Management Operations

Close

Closing a stream when it’s no longer needed is very important.

Use a finally block to guarantee that streams are closed even if an error occurs; this helps avoid serious resource leaks.

FileInputStream in = null;
FileOutputStream out = null;
try {
    in = new FileInputStream(Parameters.FILE_PATH_READ);
    out = new FileOutputStream(Parameters.FILE_PATH_WRITE);
    int c;
    while ((c = in.read()) != -1) {
        out.write(c);
    }
} finally {
    if (in != null) {
        in.close();
    }
    if (out != null) {
        out.close();
    }
}

One stream can be chained to another by passing it to the constructor of some second stream. When this second stream is closed, then it automatically closes the original underlying stream as well.

If multiple streams are chained together, then closing the one which was the last to be constructed, and is thus at the highest level of abstraction, will automatically close all the underlying streams. So, one only has to call close on one stream in order to close an entire series of related streams.
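A sketch of this behavior using try-with-resources (available since Java 7): the outermost writer is closed automatically, which in turn closes the whole chain. The file name is a placeholder.

import java.io.BufferedWriter;
import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;

public class CloseChainExample {
    public static void main(String[] args) throws IOException {
        // Closing the outermost PrintWriter also closes the BufferedWriter
        // and the FileWriter it wraps.
        try (PrintWriter out = new PrintWriter(
                 new BufferedWriter(
                     new FileWriter("out.txt")))) {   // placeholder file name
            out.println("closing the outermost writer closes the whole chain");
        }
    }
}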

You should close the outermost OutputStream or Writer you have created from the socket output stream.

When a stream is closed, the connection between the stream and the entity (mostly a file) is canceled. After you have closed a stream, you cannot perform any additional operations on it.

