Commit 46cdafb3 authored by Thomas Mueller

--no commit message

Parent 4656d167
...@@ -37,13 +37,15 @@ Hypersonic SQL or HSQLDB. H2 is built from scratch.
<h3>Version 1.0 (Current)</h3>
<h3>Version 1.0 / TODO (Build TODO)</h3><ul>
<li>PostgreSQL compatibility: SET SEARCH_PATH, SERIAL, CURRENT_USER, E'text', $1.
</li><li>In some situations, when many tables with LOB columns were modified (ALTER TABLE), large objects were deleted. Fixed.
</li><li>CREATE TABLE AS SELECT .. UNION .. did not work. Fixed.
</li><li>New column ID for INFORMATION_SCHEMA.INDEXES, SEQUENCES, USERS, ROLES, RIGHTS,
FUNCTION_ALIASES, SCHEMATA, VIEWS, CONSTRAINTS, CONSTANTS, DOMAINS, TRIGGERS.
</li><li>If large result sets (backed by a temporary file) were not closed, the file was not deleted.
Now, the default result set type is FETCH_FORWARD. This means temp files are deleted
automatically (without having to close the result set explicitly). But it also means
ResultSet.beforeFirst can only be called for scrollable result sets. To create a scrollable result set,
use Statement stat = conn.createStatement(ResultSet.TYPE_SCROLL_INSENSITIVE, ResultSet.CONCUR_READ_ONLY)
(see the sketch below).
</li><li>PreparedStatement.getMetaData is now implemented.
</li><li>Now PreparedStatement.setBigDecimal(..) can be called with an object of a derived class
...@@ -719,104 +721,6 @@ Hypersonic SQL or HSQLDB. H2 is built from scratch.
C-style block comments /* */ are not parsed correctly when they contain * or /
</li></ul>
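Below is a minimal sketch of the scrollable result set usage mentioned in the list above. The JDBC URL, the credentials and the TEST table are assumptions for illustration only, not part of this change:

import java.sql.*;

public class ScrollableResultSetSketch {
    public static void main(String[] args) throws Exception {
        Class.forName("org.h2.Driver");
        Connection conn = DriverManager.getConnection("jdbc:h2:test", "sa", "");
        // With the new FETCH_FORWARD default, temporary files behind large result sets
        // are deleted automatically, but ResultSet.beforeFirst() is no longer allowed
        // on such results. For beforeFirst(), request a scrollable result set explicitly:
        Statement stat = conn.createStatement(
                ResultSet.TYPE_SCROLL_INSENSITIVE, ResultSet.CONCUR_READ_ONLY);
        ResultSet rs = stat.executeQuery("SELECT * FROM TEST");
        while (rs.next()) {
            // process the current row
        }
        rs.beforeFirst(); // allowed here because the statement is scrollable
        conn.close();
    }
}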
<h3>Version 0.9 / 2006-07-14 (Build 16)</h3><ul>
<li>
The regression tests are no longer included in the jar file. This reduces the size by about 200 KB.
</li><li>
Fixed some bugs in the CSV tool. This tool should now work for most cases, but is still not fully tested.
</li><li>
The cache size is now measured in blocks and no longer in rows. Manually setting
the cache size is no longer necessary in most cases.
</li><li>
Objects of unknown type are no longer serialized to a byte array (and deserialized when calling getObject on a byte array data type)
by default. This behavior can be changed with Constants.SERIALIZE_JAVA_OBJECTS = true
</li><li>
New column IS_GENERATED in the metadata tables SEQUENCES and INDEXES
</li><li>
Optimization: deterministic subqueries are evaluated only once.
</li><li>
An exception was thrown if a scalar subquery returned no rows. Now the NULL value is used in this case.
</li><li>
IF EXISTS / IF NOT EXISTS implemented for the remaining CREATE / DROP statements.
</li><li>
ResultSetMetaData.isNullable is now implemented.
</li><li>
LIKE ... ESCAPE: The escape character may now also be an expression.
</li><li>
Compatibility: TRIM(whitespace FROM string)
</li><li>
Compatibility: SUBSTRING(string FROM start FOR length)
</li><li>
CREATE VIEW now supports a column list: CREATE VIEW TEST_V(A, B) AS ...
</li><li>
Compatibility: 'T', 'Y', 'YES', 'F', 'N', 'NO' (case insensitive) can now also be converted to boolean.
This is allowed now: WHERE BOOLEAN_FIELD='T'=(ID>1)
</li><li>
Optimization: data conversion of constants was not optimized. This is done now.
</li><li>
Compatibility: Implemented a shortcut to declare single-column referential integrity (see the sketch after this list):
CREATE TABLE TEST(ID INT PRIMARY KEY, PARENT INT REFERENCES TEST)
</li><li>
Issue #126: It is possible to create multiple primary keys for the same table.
</li><li>
Issue #125: Foreign key constraints of local temporary tables are not dropped when the table is dropped.
</li><li>
Issue #124: Adding a column didn't work when the table contained a referential integrity check.
</li><li>
Issue #123: The connection to the server is lost if an abnormal exception occurs.
Example SQL statement: select 1=(1,2)
</li><li>
The H2 Console didn't parse statements containing '-' or '/' correctly. Fixed.
The same facility is now used to split a script into SQL statements by
the RunScript tool, the RUNSCRIPT command and the H2 Console.
</li><li>
DatabaseMetaData.getTypeInfo: BIGINT was returning AUTO_INCREMENT=TRUE, which is wrong. Fixed.
</li></ul>
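A hypothetical JDBC snippet illustrating the single-column referential integrity shortcut from the list above (conn is an assumed open java.sql.Connection; the table and values are made up):

Statement stat = conn.createStatement();
stat.execute("CREATE TABLE TEST(ID INT PRIMARY KEY, PARENT INT REFERENCES TEST)");
stat.execute("INSERT INTO TEST VALUES(1, NULL)");
stat.execute("INSERT INTO TEST VALUES(2, 1)");    // ok: parent row 1 exists
// stat.execute("INSERT INTO TEST VALUES(3, 99)"); // would fail: 99 violates the implicit FK to TEST(ID)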
<h3>Version 0.9 / 2006-07-01 (Build 14)</h3><ul>
<li>
After dropping constraints and altering a table sometimes the database could not be opened. Fixed.
</li><li>
Outer joins did not always use an index even if this was possible. Fixed.
</li><li>
Issue #122: Using OFFSET in big result sets (disk buffered result sets) did not work. Fixed.
</li><li>
Support DatabaseMetaData.getSuperTables (currently returns an empty result set in every case).
</li><li>
Database names are no longer case sensitive on the Windows operating system,
because file names are not case sensitive there.
</li><li>
If an index is created for a constraint, this index now belongs to the constraint and is removed when the constraint is removed.
</li><li>
Issue #121: Using a quoted table or alias name in front of a column name (SELECT "TEST".ID FROM TEST) didn't work.
</li><li>
Issue #120: Some ALTER TABLE statements didn't work when the table was in another than the main schema. Fixed.
</li><li>
Issue #119: If a table with an autoincrement column was created in another schema,
it was not possible to connect to the database again.
Opening a database now sorts the script by object type first.
</li><li>
Issue #118: ALTER TABLE RENAME COLUMN doesn't work correctly. Workaround: don't use it.
</li><li>
Cache: implemented a String cache and improved the Value cache. Now uses a weak reference
to avoid OutOfMemory due to caching values.
</li><li>
Server: changed the public API slightly so that applications can handle startup problems more easily.
Now instead of Server.startTcpServer(args), use Server.createTcpServer(args).start();
</li><li>
Issue #117: Server.start...Server sometimes returned before the server was started. Solved.
</li><li>
Issue #116: Server: reduced memory usage by reducing the number of cached objects per connection.
</li><li>
Improved trace messages, and trace now starts earlier (when opening the database).
</li><li>
Simplified translation of the Web Console (a tool to convert the translation files to UTF-8).
</li><li>
Newsfeed sample application (used to create the newsfeed and newsletter).
</li><li>
New functions: MEMORY_FREE() and MEMORY_USED() (see the sketch after this list).
</li></ul>
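A hypothetical snippet querying the two new functions (conn is an assumed open java.sql.Connection):

ResultSet rs = conn.createStatement().executeQuery("SELECT MEMORY_FREE(), MEMORY_USED()");
rs.next();
System.out.println("MEMORY_FREE()=" + rs.getInt(1) + ", MEMORY_USED()=" + rs.getInt(2));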
<h3>Version 0.9 / 2005-12-13</h3><ul>
<li>
First public release.
...@@ -879,6 +783,7 @@ Hypersonic SQL or HSQLDB. H2 is built from scratch.
<h3>Priority 2</h3>
<ul>
<li>Support OSGi: http://oscar-osgi.sourceforge.net, http://incubator.apache.org/felix/index.html
</li><li>Clustering: recovery needs to become fully automatic. Global write lock feature.
</li><li>System table / function: cache usage
</li><li>Connection pool manager
</li><li>Set the database in an 'exclusive' mode (restrict to one user at a time)
...@@ -963,7 +868,6 @@ Hypersonic SQL or HSQLDB. H2 is built from scratch.
</li><li>http://www.jpackage.org
</li><li>Version column (number/sequence and timestamp based)
</li><li>Optimize getGeneratedKey: send last identity after each execute (server).
</li><li>Clustering: recovery needs to become fully automatic.
</li><li>Date: default date is '1970-01-01' (is it 1900-01-01 in the standard / other databases?)
</li><li>Test and document UPDATE TEST SET (ID, NAME) = (SELECT ID*10, NAME || '!' FROM TEST T WHERE T.ID=TEST.ID);
</li><li>Better space re-use in the files after deleting data (shrink the files)
...@@ -987,7 +891,6 @@ Hypersonic SQL or HSQLDB. H2 is built from scratch.
</li><li>ValueInt.convertToString and so on (remove Value.convertTo)
</li><li>Support custom Collators
</li><li>Document ROWNUM usage for reports: SELECT ROWNUM, * FROM (subquery)
</li><li>Clustering: Reads should be randomly distributed or to a designated database on RAM
</li><li>Clustering: When a database is back alive, automatically synchronize with the master
</li><li>Standalone tool to get relevant system properties and add it to the trace output.
...@@ -1029,10 +932,8 @@ Hypersonic SQL or HSQLDB. H2 is built from scratch.
</li><li>Improved full text search (supports LOBs, reader / tokenizer / filter).
</li><li>Performance: Update in-place
</li><li>Check if 'FSUTIL behavior set disablelastaccess 1' improves the performance (fsutil behavior query disablelastaccess)
</li><li>Remove finally() (almost) everywhere
</li><li>Java static code analysis: http://pmd.sourceforge.net/
</li><li>Java static code analysis: http://www.eclipse.org/tptp/
</li><li>Java static code analysis: http://checkstyle.sourceforge.net/
</li><li>Compatibility for CREATE SCHEMA AUTHORIZATION
</li><li>Implement Clob / Blob truncate and the remaining functionality
</li><li>Maybe close LOBs after closing connection
...@@ -1126,6 +1027,12 @@ Hypersonic SQL or HSQLDB. H2 is built from scratch.
</li><li>Server protocol: use challenge response authentication, but client sends hash(user+password) encrypted with response
</li><li>Support EXEC[UTE] (doesn't return a result set, compatible to MS SQL Server)
</li><li>GROUP BY and DISTINCT: support large groups (buffer to disk), do not keep large sets in memory
</li><li>Regular expression replaceAll: REGEXP_REPLACE(expression, regex, replacement)
</li><li>Regular expression LIKE: expression REGEXP matchExpression
</li><li>Support native XML data type
</li><li>Support triggers with a string property or option: SpringTrigger, OSGITrigger
</li><li>Clustering: adding a node should be very fast and without interrupting clients (very short lock)
</li><li>Updatable result sets: DatabaseMetaData.ownUpdatesAreVisible = true
</li></ul>
<h3>Not Planned</h3>
......
...@@ -601,8 +601,7 @@ public class Parser { ...@@ -601,8 +601,7 @@ public class Parser {
} }
private TableFilter readSimpleTableFilter() throws SQLException { private TableFilter readSimpleTableFilter() throws SQLException {
String tableName = readIdentifierWithSchema(); Table table = readTableOrView();
Table table = getSchema().getTableOrView(session, tableName);
String alias = null; String alias = null;
if(readIf("AS")) { if(readIf("AS")) {
alias = readAliasIdentifier(); alias = readAliasIdentifier();
...@@ -691,8 +690,7 @@ public class Parser { ...@@ -691,8 +690,7 @@ public class Parser {
Merge command = new Merge(session); Merge command = new Merge(session);
currentPrepared = command; currentPrepared = command;
read("INTO"); read("INTO");
String tableName = readIdentifierWithSchema(); Table table = readTableOrView();
Table table = getSchema().getTableOrView(session, tableName);
command.setTable(table); command.setTable(table);
if (readIf("(")) { if (readIf("(")) {
Column[] columns = parseColumnList(table); Column[] columns = parseColumnList(table);
...@@ -731,8 +729,7 @@ public class Parser { ...@@ -731,8 +729,7 @@ public class Parser {
Insert command = new Insert(session); Insert command = new Insert(session);
currentPrepared = command; currentPrepared = command;
read("INTO"); read("INTO");
String tableName = readIdentifierWithSchema(); Table table = readTableOrView();
Table table = getSchema().getTableOrView(session, tableName);
command.setTable(table); command.setTable(table);
if (readIf("(")) { if (readIf("(")) {
Column[] columns = parseColumnList(table); Column[] columns = parseColumnList(table);
...@@ -793,7 +790,7 @@ public class Parser { ...@@ -793,7 +790,7 @@ public class Parser {
return top; return top;
} }
} else { } else {
String tableName = readIdentifierWithSchema(); String tableName = readIdentifierWithSchema(null);
if(readIf("(")) { if(readIf("(")) {
if(tableName.equals(RangeTable.NAME)) { if(tableName.equals(RangeTable.NAME)) {
long min = readLong(); long min = readLong();
...@@ -811,7 +808,7 @@ public class Parser { ...@@ -811,7 +808,7 @@ public class Parser {
} else if(tableName.equals("DUAL")) { } else if(tableName.equals("DUAL")) {
table = new RangeTable(mainSchema, 1, 1); table = new RangeTable(mainSchema, 1, 1);
} else { } else {
table = getSchema().getTableOrView(session, tableName); table = readTableOrView(tableName);
} }
} }
String alias = null; String alias = null;
...@@ -829,9 +826,9 @@ public class Parser { ...@@ -829,9 +826,9 @@ public class Parser {
private Prepared parseTruncate() throws SQLException { private Prepared parseTruncate() throws SQLException {
read("TABLE"); read("TABLE");
String tableName = readIdentifierWithSchema(); Table table = readTableOrView();
TruncateTable command = new TruncateTable(session, getSchema()); TruncateTable command = new TruncateTable(session);
command.setTableName(tableName); command.setTable(table);
return command; return command;
} }
...@@ -1904,6 +1901,8 @@ public class Parser { ...@@ -1904,6 +1901,8 @@ public class Parser {
} }
} else if (readIf("(")) { } else if (readIf("(")) {
return readFunction(name); return readFunction(name);
} else if("CURRENT_USER".equals(name)) {
return readFunctionWithoutParameters("USER");
} else if("CURRENT".equals(name)) { } else if("CURRENT".equals(name)) {
if(readIf("TIMESTAMP")) { if(readIf("TIMESTAMP")) {
return readFunctionWithoutParameters("CURRENT_TIMESTAMP"); return readFunctionWithoutParameters("CURRENT_TIMESTAMP");
...@@ -1916,8 +1915,7 @@ public class Parser { ...@@ -1916,8 +1915,7 @@ public class Parser {
} }
} else if("NEXT".equals(name) && readIf("VALUE")) { } else if("NEXT".equals(name) && readIf("VALUE")) {
read("FOR"); read("FOR");
String sequenceName = readIdentifierWithSchema(); Sequence sequence = readSequence();
Sequence sequence = getSchema().getSequence(sequenceName);
return new SequenceValue(sequence); return new SequenceValue(sequence);
} else if("DATE".equals(name) && currentTokenType == VALUE && currentValue.getType() == Value.STRING) { } else if("DATE".equals(name) && currentTokenType == VALUE && currentValue.getType() == Value.STRING) {
String date = currentValue.getString(); String date = currentValue.getString();
...@@ -1931,6 +1929,10 @@ public class Parser { ...@@ -1931,6 +1929,10 @@ public class Parser {
String timestamp = currentValue.getString(); String timestamp = currentValue.getString();
read(); read();
return ValueExpression.get(ValueTimestamp.getNoCopy(ValueTimestamp.parseTimestamp(timestamp))); return ValueExpression.get(ValueTimestamp.getNoCopy(ValueTimestamp.parseTimestamp(timestamp)));
} else if("E".equals(name) && currentTokenType == VALUE && currentValue.getType() == Value.STRING) {
String text = currentValue.getString();
read();
return ValueExpression.get(ValueString.get(text));
} else { } else {
return new ExpressionColumn(database, currentSelect, null, null, name); return new ExpressionColumn(database, currentSelect, null, null, name);
} }
...@@ -2489,6 +2491,7 @@ public class Parser { ...@@ -2489,6 +2491,7 @@ public class Parser {
case '+': case '+':
case '%': case '%':
case '?': case '?':
case '$':
type = CHAR_SPECIAL_1; type = CHAR_SPECIAL_1;
break; break;
case '!': case '!':
...@@ -2587,6 +2590,7 @@ public class Parser { ...@@ -2587,6 +2590,7 @@ public class Parser {
if(s.length()==1) { if(s.length()==1) {
switch(c0) { switch(c0) {
case '?': case '?':
case '$':
return PARAMETER; return PARAMETER;
case '+': case '+':
return PLUS; return PLUS;
...@@ -2757,7 +2761,7 @@ public class Parser { ...@@ -2757,7 +2761,7 @@ public class Parser {
private Column parseColumnForTable(String columnName) throws SQLException { private Column parseColumnForTable(String columnName) throws SQLException {
Column column; Column column;
if(readIf("IDENTITY")) { if(readIf("IDENTITY") || readIf("SERIAL")) {
column = new Column(columnName, Value.LONG, ValueLong.PRECISION, 0); column = new Column(columnName, Value.LONG, ValueLong.PRECISION, 0);
column.setOriginalSQL("IDENTITY"); column.setOriginalSQL("IDENTITY");
long start = 1, increment = 1; long start = 1, increment = 1;
...@@ -2828,8 +2832,7 @@ public class Parser { ...@@ -2828,8 +2832,7 @@ public class Parser {
column.setConvertNullToDefault(true); column.setConvertNullToDefault(true);
} }
if(readIf("SEQUENCE")) { if(readIf("SEQUENCE")) {
String sequenceName = readIdentifierWithSchema(); Sequence sequence = readSequence();
Sequence sequence = getSchema().getSequence(sequenceName);
column.setSequence(sequence); column.setSequence(sequence);
} }
if(readIf("SELECTIVITY")) { if(readIf("SELECTIVITY")) {
...@@ -3081,8 +3084,7 @@ public class Parser { ...@@ -3081,8 +3084,7 @@ public class Parser {
if(!isRoleBased) { if(!isRoleBased) {
if(readIf("ON")) { if(readIf("ON")) {
do { do {
String tableName = readIdentifierWithSchema(); Table table = readTableOrView();
Table table = getSchema().getTableOrView(session, tableName);
command.addTable(table); command.addTable(table);
} while(readIf(",")); } while(readIf(","));
} }
...@@ -3357,11 +3359,11 @@ public class Parser { ...@@ -3357,11 +3359,11 @@ public class Parser {
private AlterIndexRename parseAlterIndex() throws SQLException { private AlterIndexRename parseAlterIndex() throws SQLException {
String indexName = readIdentifierWithSchema(); String indexName = readIdentifierWithSchema();
Schema old = getSchema(); Schema old = getSchema();
AlterIndexRename command = new AlterIndexRename(session, getSchema()); AlterIndexRename command = new AlterIndexRename(session);
command.setOldIndex(getSchema().getIndex(indexName)); command.setOldIndex(getSchema().getIndex(indexName));
read("RENAME"); read("RENAME");
read("TO"); read("TO");
String newName = readIdentifierWithSchema(old.getSQL()); String newName = readIdentifierWithSchema(old.getName());
checkSchema(old); checkSchema(old);
command.setNewName(newName); command.setNewName(newName);
return command; return command;
...@@ -3439,27 +3441,33 @@ public class Parser { ...@@ -3439,27 +3441,33 @@ public class Parser {
throw getSyntaxError(); throw getSyntaxError();
} }
private void readIfEqualOrTo() throws SQLException {
if(!readIf("=")) {
readIf("TO");
}
}
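// Sketch (not part of this patch): readIfEqualOrTo() lets SET statements accept either
// '=' or the PostgreSQL-style keyword TO. Assuming an open java.sql.Connection conn:
//
//     Statement stat = conn.createStatement();
//     stat.execute("SET SCHEMA = PUBLIC");   // existing H2 form
//     stat.execute("SET SCHEMA TO PUBLIC");  // now also accepted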
private Prepared parseSet() throws SQLException { private Prepared parseSet() throws SQLException {
if(readIf("AUTOCOMMIT")) { if(readIf("AUTOCOMMIT")) {
readIf("="); readIfEqualOrTo();
boolean value = readBooleanSetting(); boolean value = readBooleanSetting();
int setting = value ? TransactionCommand.AUTOCOMMIT_TRUE : TransactionCommand.AUTOCOMMIT_FALSE; int setting = value ? TransactionCommand.AUTOCOMMIT_TRUE : TransactionCommand.AUTOCOMMIT_FALSE;
return new TransactionCommand(session, setting); return new TransactionCommand(session, setting);
} else if(readIf("IGNORECASE")) { } else if(readIf("IGNORECASE")) {
readIf("="); readIfEqualOrTo();
boolean value = readBooleanSetting(); boolean value = readBooleanSetting();
Set command = new Set(session, SetTypes.IGNORECASE); Set command = new Set(session, SetTypes.IGNORECASE);
command.setInt(value ? 1 : 0); command.setInt(value ? 1 : 0);
return command; return command;
} else if(readIf("PASSWORD")) { } else if(readIf("PASSWORD")) {
readIf("="); readIfEqualOrTo();
AlterUser command = new AlterUser(session); AlterUser command = new AlterUser(session);
command.setType(AlterUser.SET_PASSWORD); command.setType(AlterUser.SET_PASSWORD);
command.setUser(session.getUser()); command.setUser(session.getUser());
command.setPassword(readString()); command.setPassword(readString());
return command; return command;
} else if(readIf("SALT")) { } else if(readIf("SALT")) {
readIf("="); readIfEqualOrTo();
AlterUser command = new AlterUser(session); AlterUser command = new AlterUser(session);
command.setType(AlterUser.SET_PASSWORD); command.setType(AlterUser.SET_PASSWORD);
command.setUser(session.getUser()); command.setUser(session.getUser());
...@@ -3468,12 +3476,12 @@ public class Parser { ...@@ -3468,12 +3476,12 @@ public class Parser {
command.setHash(readString()); command.setHash(readString());
return command; return command;
} else if(readIf("MODE")) { } else if(readIf("MODE")) {
readIf("="); readIfEqualOrTo();
Set command = new Set(session, SetTypes.MODE); Set command = new Set(session, SetTypes.MODE);
command.setString(readAliasIdentifier()); command.setString(readAliasIdentifier());
return command; return command;
} else if(readIf("COMPRESS_LOB")) { } else if(readIf("COMPRESS_LOB")) {
readIf("="); readIfEqualOrTo();
Set command = new Set(session, SetTypes.COMPRESS_LOB); Set command = new Set(session, SetTypes.COMPRESS_LOB);
if(currentTokenType == VALUE) { if(currentTokenType == VALUE) {
command.setString(readString()); command.setString(readString());
...@@ -3482,24 +3490,24 @@ public class Parser { ...@@ -3482,24 +3490,24 @@ public class Parser {
} }
return command; return command;
} else if(readIf("DATABASE")) { } else if(readIf("DATABASE")) {
readIf("="); readIfEqualOrTo();
read("COLLATION"); read("COLLATION");
return parseSetCollation(); return parseSetCollation();
} else if(readIf("COLLATION")) { } else if(readIf("COLLATION")) {
readIf("="); readIfEqualOrTo();
return parseSetCollation(); return parseSetCollation();
} else if(readIf("CLUSTER")) { } else if(readIf("CLUSTER")) {
readIf("="); readIfEqualOrTo();
Set command = new Set(session, SetTypes.CLUSTER); Set command = new Set(session, SetTypes.CLUSTER);
command.setString(readString()); command.setString(readString());
return command; return command;
} else if(readIf("DATABASE_EVENT_LISTENER")) { } else if(readIf("DATABASE_EVENT_LISTENER")) {
readIf("="); readIfEqualOrTo();
Set command = new Set(session, SetTypes.DATABASE_EVENT_LISTENER); Set command = new Set(session, SetTypes.DATABASE_EVENT_LISTENER);
command.setString(readString()); command.setString(readString());
return command; return command;
} else if(readIf("ALLOW_LITERALS")) { } else if(readIf("ALLOW_LITERALS")) {
readIf("="); readIfEqualOrTo();
Set command = new Set(session, SetTypes.ALLOW_LITERALS); Set command = new Set(session, SetTypes.ALLOW_LITERALS);
if(readIf("NONE")) { if(readIf("NONE")) {
command.setInt(Constants.ALLOW_LITERALS_NONE); command.setInt(Constants.ALLOW_LITERALS_NONE);
...@@ -3512,7 +3520,7 @@ public class Parser { ...@@ -3512,7 +3520,7 @@ public class Parser {
} }
return command; return command;
} else if(readIf("DEFAULT_TABLE_TYPE")) { } else if(readIf("DEFAULT_TABLE_TYPE")) {
readIf("="); readIfEqualOrTo();
Set command = new Set(session, SetTypes.DEFAULT_TABLE_TYPE); Set command = new Set(session, SetTypes.DEFAULT_TABLE_TYPE);
if(readIf("MEMORY")) { if(readIf("MEMORY")) {
command.setInt(Table.TYPE_MEMORY); command.setInt(Table.TYPE_MEMORY);
...@@ -3523,56 +3531,68 @@ public class Parser { ...@@ -3523,56 +3531,68 @@ public class Parser {
} }
return command; return command;
} else if(readIf("CREATE")) { } else if(readIf("CREATE")) {
readIf("="); readIfEqualOrTo();
// Derby compatibility (CREATE=TRUE in the database URL) // Derby compatibility (CREATE=TRUE in the database URL)
read(); read();
return new NoOperation(session); return new NoOperation(session);
} else if(readIf("HSQLDB.DEFAULT_TABLE_TYPE")) { } else if(readIf("HSQLDB.DEFAULT_TABLE_TYPE")) {
readIf("="); readIfEqualOrTo();
read(); read();
return new NoOperation(session); return new NoOperation(session);
} else if(readIf("CACHE_TYPE")) { } else if(readIf("CACHE_TYPE")) {
readIf("="); readIfEqualOrTo();
read(); read();
return new NoOperation(session); return new NoOperation(session);
} else if(readIf("FILE_LOCK")) { } else if(readIf("FILE_LOCK")) {
readIf("="); readIfEqualOrTo();
read(); read();
return new NoOperation(session); return new NoOperation(session);
} else if(readIf("STORAGE")) { } else if(readIf("STORAGE")) {
readIf("="); readIfEqualOrTo();
read(); read();
return new NoOperation(session); return new NoOperation(session);
} else if(readIf("DB_CLOSE_ON_EXIT")) { } else if(readIf("DB_CLOSE_ON_EXIT")) {
readIf("="); readIfEqualOrTo();
read(); read();
return new NoOperation(session); return new NoOperation(session);
} else if(readIf("ACCESS_MODE_LOG")) { } else if(readIf("ACCESS_MODE_LOG")) {
readIf("="); readIfEqualOrTo();
read(); read();
return new NoOperation(session); return new NoOperation(session);
} else if(readIf("ASSERT")) { } else if(readIf("ASSERT")) {
readIf("="); readIfEqualOrTo();
read(); read();
return new NoOperation(session); return new NoOperation(session);
} else if(readIf("ACCESS_MODE_DATA")) { } else if(readIf("ACCESS_MODE_DATA")) {
readIf("="); readIfEqualOrTo();
read(); read();
return new NoOperation(session); return new NoOperation(session);
} else if(readIf("RECOVER")) { } else if(readIf("RECOVER")) {
readIf("="); readIfEqualOrTo();
read(); read();
return new NoOperation(session); return new NoOperation(session);
} else if(readIf("SCHEMA")) { } else if(readIf("SCHEMA")) {
readIf("="); readIfEqualOrTo();
Set command = new Set(session, SetTypes.SCHEMA); Set command = new Set(session, SetTypes.SCHEMA);
command.setString(readAliasIdentifier()); command.setString(readAliasIdentifier());
return command; return command;
} else if(readIf("DATESTYLE")) { } else if(readIf("DATESTYLE")) {
// PostgreSQL compatibility // PostgreSQL compatibility
readIf("="); readIfEqualOrTo();
read("ISO"); read("ISO");
return new NoOperation(session); return new NoOperation(session);
} else if(readIf("SEARCH_PATH") || readIf(SetTypes.getTypeName(SetTypes.SCHEMA_SEARCH_PATH))) {
readIfEqualOrTo();
Set command = new Set(session, SetTypes.SCHEMA_SEARCH_PATH);
ObjectArray list = new ObjectArray();
list.add(readAliasIdentifier());
while(readIf(",")) {
list.add(readAliasIdentifier());
}
String[] schemaNames = new String[list.size()];
list.toArray(schemaNames);
command.setStringArray(schemaNames);
return command;
} else { } else {
if(isToken("LOGSIZE")) { if(isToken("LOGSIZE")) {
// HSQLDB compatibility // HSQLDB compatibility
...@@ -3581,7 +3601,7 @@ public class Parser { ...@@ -3581,7 +3601,7 @@ public class Parser {
int type = SetTypes.getType(currentToken); int type = SetTypes.getType(currentToken);
if(type >= 0) { if(type >= 0) {
read(); read();
readIf("="); readIfEqualOrTo();
Set command = new Set(session, type); Set command = new Set(session, type);
command.setExpression(readExpression()); command.setExpression(readExpression());
return command; return command;
...@@ -3675,12 +3695,55 @@ public class Parser { ...@@ -3675,12 +3695,55 @@ public class Parser {
return command; return command;
} }
private Table readTableOrView() throws SQLException {
return readTableOrView(readIdentifierWithSchema(null));
}
private Table readTableOrView(String tableName) throws SQLException {
// same algorithm as readSequence
if(schemaName != null) {
return getSchema().getTableOrView(session, tableName);
}
Table table = database.getSchema(session.getCurrentSchemaName()).findTableOrView(session, tableName);
if(table != null) {
return table;
}
String[] schemaNames = session.getSchemaSearchPath();
for(int i=0; schemaNames != null && i<schemaNames.length; i++) {
Schema s = database.getSchema(schemaNames[i]);
table = s.findTableOrView(session, tableName);
if(table != null) {
return table;
}
}
throw Message.getSQLException(Message.TABLE_OR_VIEW_NOT_FOUND_1, tableName);
}
private Sequence readSequence() throws SQLException {
// same algorithm as readTableOrView
String sequenceName = readIdentifierWithSchema(null);
if(schemaName != null) {
return getSchema().getSequence(sequenceName);
}
Sequence sequence = database.getSchema(session.getCurrentSchemaName()).findSequence(sequenceName);
if(sequence != null) {
return sequence;
}
String[] schemaNames = session.getSchemaSearchPath();
for(int i=0; schemaNames != null && i<schemaNames.length; i++) {
Schema s = database.getSchema(schemaNames[i]);
sequence = s.findSequence(sequenceName);
if(sequence != null) {
return sequence;
}
}
throw Message.getSQLException(Message.SEQUENCE_NOT_FOUND_1, sequenceName);
}
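// Sketch (not part of this patch): with readTableOrView / readSequence above, an unqualified
// name is resolved first against the session's current schema and then against each schema
// on the search path. Assuming an open java.sql.Connection conn:
//
//     Statement stat = conn.createStatement();
//     stat.execute("SET SEARCH_PATH = PUBLIC, INFORMATION_SCHEMA");
//     // TABLES is not in PUBLIC, so it is resolved via the search path to INFORMATION_SCHEMA.TABLES
//     ResultSet rs = stat.executeQuery("SELECT TABLE_NAME FROM TABLES");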
private Prepared parseAlterTable() throws SQLException { private Prepared parseAlterTable() throws SQLException {
String tableName = readIdentifierWithSchema(); Table table = readTableOrView();
Schema tableSchema = getSchema();
Table table = getSchema().getTableOrView(session, tableName);
if(readIf("ADD")) { if(readIf("ADD")) {
Prepared command = parseAlterTableAddConstraintIf(getSchema(), tableName); Prepared command = parseAlterTableAddConstraintIf(table.getName(), table.getSchema());
if(command != null) { if(command != null) {
return command; return command;
} }
...@@ -3694,34 +3757,34 @@ public class Parser { ...@@ -3694,34 +3757,34 @@ public class Parser {
read("FALSE"); read("FALSE");
type = AlterTableAddConstraint.REFERENTIAL_INTEGRITY_FALSE; type = AlterTableAddConstraint.REFERENTIAL_INTEGRITY_FALSE;
} }
AlterTableAddConstraint command = new AlterTableAddConstraint(session, getSchema()); AlterTableAddConstraint command = new AlterTableAddConstraint(session, table.getSchema());
command.setTableName(tableName); command.setTableName(table.getName());
command.setType(type); command.setType(type);
return command; return command;
} else if(readIf("RENAME")) { } else if(readIf("RENAME")) {
read("TO"); read("TO");
String newName = readIdentifierWithSchema(tableSchema.getSQL()); String newName = readIdentifierWithSchema(table.getSchema().getName());
checkSchema(tableSchema); checkSchema(table.getSchema());
AlterTableRename command = new AlterTableRename(session, getSchema()); AlterTableRename command = new AlterTableRename(session, getSchema());
command.setOldTable(table); command.setOldTable(table);
command.setNewTableName(newName); command.setNewTableName(newName);
return command; return command;
} else if(readIf("DROP")) { } else if(readIf("DROP")) {
if(readIf("CONSTRAINT")) { if(readIf("CONSTRAINT")) {
String constraintName = readIdentifierWithSchema(tableSchema.getSQL()); String constraintName = readIdentifierWithSchema(table.getSchema().getName());
checkSchema(tableSchema); checkSchema(table.getSchema());
AlterTableDropConstraint command = new AlterTableDropConstraint(session, getSchema()); AlterTableDropConstraint command = new AlterTableDropConstraint(session, getSchema());
command.setConstraintName(constraintName); command.setConstraintName(constraintName);
return command; return command;
} else if(readIf("PRIMARY")) { } else if(readIf("PRIMARY")) {
read("KEY"); read("KEY");
Index idx = table.getPrimaryKey(); Index idx = table.getPrimaryKey();
DropIndex command = new DropIndex(session, tableSchema); DropIndex command = new DropIndex(session, table.getSchema());
command.setIndexName(idx.getName()); command.setIndexName(idx.getName());
return command; return command;
} else { } else {
readIf("COLUMN"); readIf("COLUMN");
AlterTableAlterColumn command = new AlterTableAlterColumn(session, tableSchema); AlterTableAlterColumn command = new AlterTableAlterColumn(session, table.getSchema());
command.setType(AlterTableAlterColumn.DROP); command.setType(AlterTableAlterColumn.DROP);
String columnName = readColumnIdentifier(); String columnName = readColumnIdentifier();
command.setTable(table); command.setTable(table);
...@@ -3745,14 +3808,14 @@ public class Parser { ...@@ -3745,14 +3808,14 @@ public class Parser {
// Derby compatibility // Derby compatibility
read("TYPE"); read("TYPE");
Column newColumn = parseColumnForTable(columnName); Column newColumn = parseColumnForTable(columnName);
AlterTableAlterColumn command = new AlterTableAlterColumn(session, tableSchema); AlterTableAlterColumn command = new AlterTableAlterColumn(session, table.getSchema());
command.setTable(table); command.setTable(table);
command.setType(AlterTableAlterColumn.CHANGE_TYPE); command.setType(AlterTableAlterColumn.CHANGE_TYPE);
command.setOldColumn(column); command.setOldColumn(column);
command.setNewColumn(newColumn); command.setNewColumn(newColumn);
return command; return command;
} }
AlterTableAlterColumn command = new AlterTableAlterColumn(session, tableSchema); AlterTableAlterColumn command = new AlterTableAlterColumn(session, table.getSchema());
command.setTable(table); command.setTable(table);
command.setOldColumn(column); command.setOldColumn(column);
if(readIf("NULL")) { if(readIf("NULL")) {
...@@ -3771,7 +3834,7 @@ public class Parser { ...@@ -3771,7 +3834,7 @@ public class Parser {
} else if(readIf("RESTART")) { } else if(readIf("RESTART")) {
readIf("WITH"); readIf("WITH");
long start = readLong(); long start = readLong();
AlterTableAlterColumn command = new AlterTableAlterColumn(session, tableSchema); AlterTableAlterColumn command = new AlterTableAlterColumn(session, table.getSchema());
command.setTable(table); command.setTable(table);
command.setType(AlterTableAlterColumn.RESTART); command.setType(AlterTableAlterColumn.RESTART);
command.setOldColumn(column); command.setOldColumn(column);
...@@ -3779,7 +3842,7 @@ public class Parser { ...@@ -3779,7 +3842,7 @@ public class Parser {
return command; return command;
} else if(readIf("SELECTIVITY")) { } else if(readIf("SELECTIVITY")) {
int selectivity = getPositiveInt(); int selectivity = getPositiveInt();
AlterTableAlterColumn command = new AlterTableAlterColumn(session, tableSchema); AlterTableAlterColumn command = new AlterTableAlterColumn(session, table.getSchema());
command.setTable(table); command.setTable(table);
command.setType(AlterTableAlterColumn.SELECTIVITY); command.setType(AlterTableAlterColumn.SELECTIVITY);
command.setOldColumn(column); command.setOldColumn(column);
...@@ -3787,7 +3850,7 @@ public class Parser { ...@@ -3787,7 +3850,7 @@ public class Parser {
return command; return command;
} else { } else {
Column newColumn = parseColumnForTable(columnName); Column newColumn = parseColumnForTable(columnName);
AlterTableAlterColumn command = new AlterTableAlterColumn(session, tableSchema); AlterTableAlterColumn command = new AlterTableAlterColumn(session, table.getSchema());
command.setTable(table); command.setTable(table);
command.setType(AlterTableAlterColumn.CHANGE_TYPE); command.setType(AlterTableAlterColumn.CHANGE_TYPE);
command.setOldColumn(column); command.setOldColumn(column);
...@@ -3832,7 +3895,7 @@ public class Parser { ...@@ -3832,7 +3895,7 @@ public class Parser {
} }
} }
private Prepared parseAlterTableAddConstraintIf(Schema schema, String tableName) throws SQLException { private Prepared parseAlterTableAddConstraintIf(String tableName, Schema schema) throws SQLException {
String name = null, comment = null; String name = null, comment = null;
if(readIf("CONSTRAINT")) { if(readIf("CONSTRAINT")) {
name = readIdentifierWithSchema(schema.getName()); name = readIdentifierWithSchema(schema.getName());
...@@ -3989,7 +4052,7 @@ public class Parser { ...@@ -3989,7 +4052,7 @@ public class Parser {
read("("); read("(");
if(!readIf(")")) { if(!readIf(")")) {
do { do {
Prepared c = parseAlterTableAddConstraintIf(schema, tableName); Prepared c = parseAlterTableAddConstraintIf(tableName, schema);
if(c != null) { if(c != null) {
command.addConstraintCommand(c); command.addConstraintCommand(c);
} else { } else {
......
...@@ -13,13 +13,13 @@ import org.h2.index.Index; ...@@ -13,13 +13,13 @@ import org.h2.index.Index;
import org.h2.message.Message; import org.h2.message.Message;
import org.h2.schema.Schema; import org.h2.schema.Schema;
public class AlterIndexRename extends SchemaCommand { public class AlterIndexRename extends DefineCommand {
private Index oldIndex; private Index oldIndex;
private String newIndexName; private String newIndexName;
public AlterIndexRename(Session session, Schema schema) { public AlterIndexRename(Session session) {
super(session, schema); super(session);
} }
public void setOldIndex(Index index) { public void setOldIndex(Index index) {
...@@ -33,7 +33,8 @@ public class AlterIndexRename extends SchemaCommand { ...@@ -33,7 +33,8 @@ public class AlterIndexRename extends SchemaCommand {
public int update() throws SQLException { public int update() throws SQLException {
session.commit(true); session.commit(true);
Database db = session.getDatabase(); Database db = session.getDatabase();
if(getSchema().findIndex(newIndexName) != null || newIndexName.equals(oldIndex.getName())) { Schema schema = oldIndex.getSchema();
if(schema.findIndex(newIndexName) != null || newIndexName.equals(oldIndex.getName())) {
throw Message.getSQLException(Message.INDEX_ALREADY_EXISTS_1, newIndexName); throw Message.getSQLException(Message.INDEX_ALREADY_EXISTS_1, newIndexName);
} }
session.getUser().checkRight(oldIndex.getTable(), Right.ALL); session.getUser().checkRight(oldIndex.getTable(), Right.ALL);
......
...@@ -37,6 +37,7 @@ public class AlterTableAddConstraint extends SchemaCommand { ...@@ -37,6 +37,7 @@ public class AlterTableAddConstraint extends SchemaCommand {
private int type; private int type;
private String constraintName; private String constraintName;
private String tableName; private String tableName;
private Table table;
private String[] columnNames; private String[] columnNames;
private int deleteAction; private int deleteAction;
private int updateAction; private int updateAction;
...@@ -61,12 +62,12 @@ public class AlterTableAddConstraint extends SchemaCommand { ...@@ -61,12 +62,12 @@ public class AlterTableAddConstraint extends SchemaCommand {
public int update() throws SQLException { public int update() throws SQLException {
session.commit(true); session.commit(true);
Database db = session.getDatabase(); Database db = session.getDatabase();
table = getSchema().getTableOrView(session, tableName);
if(getSchema().findConstraint(constraintName)!=null) { if(getSchema().findConstraint(constraintName)!=null) {
throw Message.getSQLException(Message.CONSTRAINT_ALREADY_EXISTS_1, throw Message.getSQLException(Message.CONSTRAINT_ALREADY_EXISTS_1,
constraintName); constraintName);
} }
Constraint constraint; Constraint constraint;
Table table = getSchema().getTableOrView(session, tableName);
session.getUser().checkRight(table, Right.ALL); session.getUser().checkRight(table, Right.ALL);
table.lock(session, true); table.lock(session, true);
switch(type) { switch(type) {
......
/*
* Copyright 2004-2006 H2 Group. Licensed under the H2 License, Version 1.0 (http://h2database.com/html/license.html).
* Initial Developer: H2 Group
*/
package org.h2.command.ddl; package org.h2.command.ddl;
import java.sql.SQLException; import java.sql.SQLException;
......
/*
* Copyright 2004-2006 H2 Group. Licensed under the H2 License, Version 1.0 (http://h2database.com/html/license.html).
* Initial Developer: H2 Group
*/
package org.h2.command.ddl; package org.h2.command.ddl;
import java.sql.SQLException; import java.sql.SQLException;
......
...@@ -9,26 +9,24 @@ import java.sql.SQLException; ...@@ -9,26 +9,24 @@ import java.sql.SQLException;
import org.h2.engine.Right; import org.h2.engine.Right;
import org.h2.engine.Session; import org.h2.engine.Session;
import org.h2.message.Message; import org.h2.message.Message;
import org.h2.schema.Schema;
import org.h2.table.Table; import org.h2.table.Table;
public class TruncateTable extends SchemaCommand { public class TruncateTable extends DefineCommand {
private String tableName; private Table table;
public TruncateTable(Session session, Schema schema) { public TruncateTable(Session session) {
super(session, schema); super(session);
} }
public void setTableName(String tableName) { public void setTable(Table table) {
this.tableName = tableName; this.table = table;
} }
public int update() throws SQLException { public int update() throws SQLException {
session.commit(true); session.commit(true);
Table table = getSchema().getTableOrView(session, tableName);
if(!table.canTruncate()) { if(!table.canTruncate()) {
throw Message.getSQLException(Message.CANNOT_TRUNCATE_1, tableName); throw Message.getSQLException(Message.CANNOT_TRUNCATE_1, table.getSQL());
} else { } else {
session.getUser().checkRight(table, Right.DELETE); session.getUser().checkRight(table, Right.DELETE);
table.lock(session, true); table.lock(session, true);
......
/*
* Copyright 2004-2006 H2 Group. Licensed under the H2 License, Version 1.0 (http://h2database.com/html/license.html).
* Initial Developer: H2 Group
*/
package org.h2.command.dml; package org.h2.command.dml;
import java.sql.SQLException; import java.sql.SQLException;
......
...@@ -36,6 +36,7 @@ public class Set extends Prepared { ...@@ -36,6 +36,7 @@ public class Set extends Prepared {
private int type; private int type;
private Expression expression; private Expression expression;
private String stringValue; private String stringValue;
private String[] stringValueList;
public Set(Session session, int type) { public Set(Session session, int type) {
super(session); super(session);
...@@ -240,6 +241,10 @@ public class Set extends Prepared { ...@@ -240,6 +241,10 @@ public class Set extends Prepared {
database.setOptimizeReuseResults(getIntValue() != 0); database.setOptimizeReuseResults(getIntValue() != 0);
break; break;
} }
case SetTypes.SCHEMA_SEARCH_PATH: {
session.setSchemaSearchPath(stringValueList);
break;
}
default: default:
throw Message.getInternalError("type="+type); throw Message.getInternalError("type="+type);
} }
...@@ -297,4 +302,8 @@ public class Set extends Prepared { ...@@ -297,4 +302,8 @@ public class Set extends Prepared {
return null; return null;
} }
public void setStringArray(String[] list) {
this.stringValueList = list;
}
} }
...@@ -16,7 +16,7 @@ public class SetTypes { ...@@ -16,7 +16,7 @@ public class SetTypes {
public static final int MAX_MEMORY_ROWS = 16, LOCK_MODE = 17, DB_CLOSE_DELAY = 18; public static final int MAX_MEMORY_ROWS = 16, LOCK_MODE = 17, DB_CLOSE_DELAY = 18;
public static final int LOG = 19, THROTTLE = 20, MAX_MEMORY_UNDO = 21, MAX_LENGTH_INPLACE_LOB = 22; public static final int LOG = 19, THROTTLE = 20, MAX_MEMORY_UNDO = 21, MAX_LENGTH_INPLACE_LOB = 22;
public static final int COMPRESS_LOB = 23, ALLOW_LITERALS = 24, MULTI_THREADED = 25, SCHEMA = 26; public static final int COMPRESS_LOB = 23, ALLOW_LITERALS = 24, MULTI_THREADED = 25, SCHEMA = 26;
public static final int OPTIMIZE_REUSE_RESULTS = 27; public static final int OPTIMIZE_REUSE_RESULTS = 27, SCHEMA_SEARCH_PATH = 28;
private static ObjectArray types = new ObjectArray(); private static ObjectArray types = new ObjectArray();
static { static {
...@@ -47,6 +47,7 @@ public class SetTypes { ...@@ -47,6 +47,7 @@ public class SetTypes {
setType(MULTI_THREADED, "MULTI_THREADED"); setType(MULTI_THREADED, "MULTI_THREADED");
setType(SCHEMA, "SCHEMA"); setType(SCHEMA, "SCHEMA");
setType(OPTIMIZE_REUSE_RESULTS, "OPTIMIZE_REUSE_RESULTS"); setType(OPTIMIZE_REUSE_RESULTS, "OPTIMIZE_REUSE_RESULTS");
setType(SCHEMA_SEARCH_PATH, "SCHEMA_SEARCH_PATH");
} }
private static void setType(int type, String name) { private static void setType(int type, String name) {
......
/*
* Copyright 2004-2006 H2 Group. Licensed under the H2 License, Version 1.0 (http://h2database.com/html/license.html).
* Initial Developer: H2 Group
*/
package org.h2.engine; package org.h2.engine;
import org.h2.command.Prepared; import org.h2.command.Prepared;
......
...@@ -55,6 +55,7 @@ public class Session implements SessionInterface { ...@@ -55,6 +55,7 @@ public class Session implements SessionInterface {
private Command currentCommand; private Command currentCommand;
private boolean allowLiterals; private boolean allowLiterals;
private String currentSchemaName; private String currentSchemaName;
private String[] schemaSearchPath;
private String traceModuleName; private String traceModuleName;
private HashSet unlinkSet; private HashSet unlinkSet;
private int tempViewIndex; private int tempViewIndex;
...@@ -503,4 +504,13 @@ public class Session implements SessionInterface { ...@@ -503,4 +504,13 @@ public class Session implements SessionInterface {
} }
return (Procedure) procedures.get(name); return (Procedure) procedures.get(name);
} }
public void setSchemaSearchPath(String[] schemas) {
this.schemaSearchPath = schemas;
}
public String[] getSchemaSearchPath() {
return schemaSearchPath;
}
} }
...@@ -364,26 +364,8 @@ public class PgServerThread implements Runnable { ...@@ -364,26 +364,8 @@ public class PgServerThread implements Runnable {
} else if(s.startsWith("BEGIN")) { } else if(s.startsWith("BEGIN")) {
s = "set DATESTYLE ISO"; s = "set DATESTYLE ISO";
} }
s = StringUtils.replaceAll(s, "FROM pg_database", "FROM pg_catalog.pg_database"); int todoNeedToSupportInParser;
s = StringUtils.replaceAll(s, "FROM pg_user", "FROM pg_catalog.pg_user");
s = StringUtils.replaceAll(s, "FROM pg_settings", "FROM pg_catalog.pg_settings");
s = StringUtils.replaceAll(s, "FROM pg_database", "FROM pg_catalog.pg_database");
s = StringUtils.replaceAll(s, "JOIN pg_tablespace", "JOIN pg_catalog.pg_tablespace");
s = StringUtils.replaceAll(s, "FROM pg_tablespace", "FROM pg_catalog.pg_tablespace");
s = StringUtils.replaceAll(s, "FROM pg_class", "FROM pg_catalog.pg_class");
s = StringUtils.replaceAll(s, "from pg_class", "from pg_catalog.pg_class");
s = StringUtils.replaceAll(s, ", pg_namespace", ", pg_catalog.pg_namespace");
s = StringUtils.replaceAll(s, "JOIN pg_namespace", "JOIN pg_catalog.pg_namespace");
s = StringUtils.replaceAll(s, "FROM pg_authid", "FROM pg_catalog.pg_authid");
s = StringUtils.replaceAll(s, "from pg_type", "from pg_catalog.pg_type");
s = StringUtils.replaceAll(s, "join pg_attrdef", "join pg_catalog.pg_attrdef");
s = StringUtils.replaceAll(s, "i.indkey[ia.attnum-1]", "0"); s = StringUtils.replaceAll(s, "i.indkey[ia.attnum-1]", "0");
s = StringUtils.replaceAll(s, "current_user", "USER()");
s = StringUtils.replaceAll(s, "E'", "'"); // VALUES (E'2'[*], E'Test')
if(s.indexOf('$') > 0) {
int todoDontReplaceInQuoted;
s = s.replace('$', '?');
}
return s; return s;
} }
......
/*
* Copyright 2004-2006 H2 Group. Licensed under the H2 License, Version 1.0 (http://h2database.com/html/license.html).
* Initial Developer: H2 Group
*/
;
drop schema if exists pg_catalog; drop schema if exists pg_catalog;
create schema pg_catalog; create schema pg_catalog;
set search_path = PUBLIC, pg_catalog;
create table pg_catalog.pg_namespace -- (oid, nspname) create table pg_catalog.pg_namespace -- (oid, nspname)
as as
select select
......
...@@ -113,7 +113,7 @@ public class DataType { ...@@ -113,7 +113,7 @@ public class DataType {
); );
add(Value.LONG, Types.BIGINT, "Long", add(Value.LONG, Types.BIGINT, "Long",
createDecimal(ValueLong.PRECISION, ValueLong.PRECISION, 0, false, true), createDecimal(ValueLong.PRECISION, ValueLong.PRECISION, 0, false, true),
new String[]{"IDENTITY"} new String[]{"IDENTITY", "SERIAL"}
); );
add(Value.DECIMAL, Types.DECIMAL, "BigDecimal", add(Value.DECIMAL, Types.DECIMAL, "BigDecimal",
createDecimal(Integer.MAX_VALUE, ValueDecimal.DEFAULT_PRECISION, ValueDecimal.DEFAULT_SCALE, true, false), createDecimal(Integer.MAX_VALUE, ValueDecimal.DEFAULT_PRECISION, ValueDecimal.DEFAULT_SCALE, true, false),
......
...@@ -555,8 +555,6 @@ public class ValueLob extends Value { ...@@ -555,8 +555,6 @@ public class ValueLob extends Value {
String[] list = FileUtils.listFiles(dir); String[] list = FileUtils.listFiles(dir);
for(int i=0; i<list.length; i++) { for(int i=0; i<list.length; i++) {
String name = list[i]; String name = list[i];
int testing;
// if(name.startsWith(prefix+ "." + tableId + ".") && name.endsWith(".lob.db")) {
if(name.startsWith(prefix+ "." + tableId + ".") && name.endsWith(".lob.db")) { if(name.startsWith(prefix+ "." + tableId + ".") && name.endsWith(".lob.db")) {
deleteFile(handler, name); deleteFile(handler, name);
} }
......
...@@ -94,6 +94,16 @@ java -Xmx512m -Xrunhprof:cpu=samples,depth=8 org.h2.tools.RunScript -url jdbc:h2 ...@@ -94,6 +94,16 @@ java -Xmx512m -Xrunhprof:cpu=samples,depth=8 org.h2.tools.RunScript -url jdbc:h2
/* /*
-- SET client_encoding = 'UTF8';
-- SET check_function_bodies = false;
-- SET client_min_messages = warning;
-- CREATE PROCEDURAL LANGUAGE plperl;
-- CREATE PROCEDURAL LANGUAGE plpgsql;
--SET search_path = public, pg_catalog;
--SET default_tablespace = '';
--SET default_with_oids = false;
--id serial NOT NULL,
pg_catalog with views pg_catalog with views
oid (object identifier) oid (object identifier)
......
...@@ -35,6 +35,7 @@ public class TestLob extends TestBase { ...@@ -35,6 +35,7 @@ public class TestLob extends TestBase {
if(config.memory) { if(config.memory) {
return; return;
} }
testLobDrop();
testLobNoClose(); testLobNoClose();
testLobTransactions(10); testLobTransactions(10);
testLobTransactions(10000); testLobTransactions(10000);
...@@ -54,6 +55,28 @@ public class TestLob extends TestBase { ...@@ -54,6 +55,28 @@ public class TestLob extends TestBase {
testJavaObject(); testJavaObject();
} }
private void testLobDrop() throws Exception {
if(config.logMode == 0 || config.networked) {
return;
}
deleteDb("lob");
Connection conn = reconnect(null);
Statement stat = conn.createStatement();
for(int i=0; i<500; i++) {
stat.execute("CREATE TABLE T"+i +"(ID INT, C CLOB)");
}
stat.execute("CREATE TABLE TEST(ID INT, C CLOB)");
stat.execute("INSERT INTO TEST VALUES(1, SPACE(10000))");
for(int i=0; i<500; i++) {
stat.execute("DROP TABLE T"+i);
}
ResultSet rs = stat.executeQuery("SELECT * FROM TEST");
while(rs.next()) {
rs.getString("C");
}
conn.close();
}
private void testLobNoClose() throws Exception { private void testLobNoClose() throws Exception {
if(config.logMode == 0 || config.networked) { if(config.logMode == 0 || config.networked) {
return; return;
......
--- special grammar and test cases --------------------------------------------------------------------------------------------- --- special grammar and test cases ---------------------------------------------------------------------------------------------
set autocommit off;
> ok
set search_path = public, information_schema;
> ok
select table_name from tables where 1=0;
> TABLE_NAME
> ----------
> rows: 0
set search_path = public;
> ok
set autocommit on;
> ok
create table script.public.x(a int); create table script.public.x(a int);
> ok > ok
......
...@@ -484,3 +484,14 @@ chdh biz inventec ...@@ -484,3 +484,14 @@ chdh biz inventec
enclosing mostly dtp scrolls cars splitting replay incomplete automate enclosing mostly dtp scrolls cars splitting replay incomplete automate
shorten shorten
attrdef resut reltuples indrelid tuple adrelid rolconfig relnamespace attname rolpassword atttypid
represented rolname indisprimary tablespace proname rolconnlimit currtid indexdef rolcreatedb
indexrelid datdba datname adnum tgnargs attnum relam userbyid typbasetype attlen rolcanlogin
rolinherit adsrc usecreatedb superuser indexprs tgfoid indisunique spcname cleartext relpages
usesuper pgdocs tginitdeferred objoid datestyle indisclustered usename datconfig tgargs resize
tgconstrrelid classoid relhasoids pretty portals rolcatupdate rolsuper spcowner typname cet typlen
latin tgconstrname datallowconn atttypmod dattablespace attrelid ctid timestamptz atthasdef
nspname objsubid typnamespace rolcreaterole tgrelid spclocation relhasrules dont indkey postmaster
relkind autovacuum datlastsysoid attisdropped amname datacl deallocate tgdeferrable stats
spcacl relname rolvaliduntil attnotnull authid aclitem
plpgsql interrupting spring oids plperl regex
\ No newline at end of file