Commit 99301b9e authored by andrei

Merge remote-tracking branch 'h2database/master' into non_blocking

# Conflicts:
#	h2/src/main/org/h2/mvstore/MVStore.java
@@ -21,7 +21,257 @@ Change Log
<h2>Next Version (unreleased)</h2>
<ul>
-<li>PR #967: Adds ARRAY_AGG()
+<li>PR #984: Minor refactorings in Parser
</li>
<li>Issue #933: MVStore background writer endless loop
</li>
<li>PR #981: Reorganize date-time functions
</li>
<li>PR #980: Add Parser.toString() method for improved debugging experience
</li>
<li>PR #979: Remove support of TCP protocol versions 6 and 7
</li>
<li>PR #977: Add database versions to javadoc of TCP protocol versions and update dictionary.txt
</li>
<li>PR #976: Add and use incrementDateValue() and decrementDateValue()
</li>
<li>Issue #974: Inline PRIMARY KEY definition loses its name
</li>
<li>PR #972: Add META-INF/versions to all non-Android jars that use Bits
</li>
<li>PR #971: Update ASM from 6.1-beta to 6.1
</li>
<li>PR #970: Added support for ENUM in prepared statement where clause
</li>
<li>PR #968: Assorted changes
</li>
<li>PR #967: Adds ARRAY_AGG function
</li>
<li>PR #966: Do not include help and images in client jar
</li>
<li>PR #965: Do not include mvstore.DataUtils in client jar and other changes
</li>
<li>PR #964: Fix TestFunctions.testToCharFromDateTime()
</li>
<li>PR #963 / Issue #962: Improve documentation of compatibility modes and fix ssl URL description
</li>
<li>Issue #219: H2 mode MySQL- ON UPDATE CURRENT_TIMESTAMP not supported
</li>
<li>PR #958: More fixes for PgServer
</li>
<li>PR #957: Update database size information and links in README.md
</li>
<li>PR #956: Move tests added in 821117f1db120a265647a063dca13ab5bee98efc to a proper place
</li>
<li>PR #955: Support getObject(?, Class) in generated keys
</li>
<li>PR #954: Avoid incorrect reads in iterators of TransactionMap
</li>
<li>PR #952: Optimize arguments for MVMap.init()
</li>
<li>PR #949: Fix table borders in PDF and other changes
</li>
<li>PR #948: Fix some grammar descriptions and ALTER TABLE DROP COLUMN parsing
</li>
<li>PR #947: Fix building of documentation and use modern names of Java versions
</li>
<li>PR #943: Assorted changes in documentation and a fix for current-time.sql
</li>
<li>PR #942: Fix page numbers in TOC in PDF and move System Tables into own HTML / section in PDF
</li>
<li>PR #941: Use >> syntax in median.sql and move out more tests from testScript.sql
</li>
<li>PR #940: add Support for MySQL: DROP INDEX index_name ON tbl_name
</li>
<li>PR #939: Short syntax for SQL tests
</li>
<li>Issue #935: The "date_trunc" function is not recognized for 'day'
</li>
<li>PR #936: Fix font size, line length, TOC, and many broken links in PDF
</li>
<li>PR #931: Assorted changes in documentation
</li>
<li>PR #930: Use Math.log10() and remove Mode.getOracle()
</li>
<li>PR #929: Remove Mode.supportOffsetFetch
</li>
<li>PR #928: Show information about office configuration instead of fallback PDF generation mode
</li>
<li>PR #926: Describe datetime fields in documentation
</li>
<li>PR #925: Fix time overflow in DATEADD
</li>
<li>Issue #416: Add support for DROP SCHEMA x { RESTRICT | CASCADE }
</li>
<li>PR #922: Parse and treat fractional seconds precision as described in SQL standard
</li>
<li>Issue #919: Add support for mixing adding constraints and columns in multi-add ALTER TABLE statement
</li>
<li>PR #916: Implement TABLE_CONSTRAINTS and REFERENTIAL_CONSTRAINTS from the SQL standard
</li>
<li>PR #915: Implement INFORMATION_SCHEMA.KEY_COLUMN_USAGE from SQL standard
</li>
<li>PR #914: don't allow null values in ConcurrentArrayList
</li>
<li>PR #913: Assorted changes in tests and documentation
</li>
<li>Issue #755: Missing FLOAT(precision)?
</li>
<li>PR #911: Add support for MySQL-style ALTER TABLE ADD ... FIRST
</li>
<li>Issue #409: Support derived column list syntax on table alias
</li>
<li>PR #908: remove dead code
</li>
<li>PR #907: Nest joins only if required and fix some issues with complex joins
</li>
<li>PR #906: Fix obscure error on non-standard SELECT * FROM A LEFT JOIN B NATURAL JOIN C
</li>
<li>PR #805: Move some JOIN tests from testScript.sql to own file
</li>
<li>PR #804: Remove unused parameters from readJoin() and readTableFilter()
</li>
<li>Issue #322: CSVREAD WHERE clause containing ORs duplicates number of rows
</li>
<li>PR #902: Remove DbSettings.nestedJoins
</li>
<li>PR #900: Convert duplicate anonymous classes in TableFilter to nested for reuse
</li>
<li>PR #899: Fix ON DUPLICATE KEY UPDATE for inserts with multiple rows
</li>
<li>PR #898: Parse TIME WITHOUT TIME ZONE and fix TIMESTAMP as column name
</li>
<li>PR #897: Update JTS to version 1.15.0 from LocationTech
</li>
<li>PR #896: Assorted changes in help.csv
</li>
<li>PR #895: Parse more variants of timestamps with time zones
</li>
<li>PR #893: TIMESTAMP WITHOUT TIME ZONE, TIMEZONE_HOUR, and TIMEZONE_MINUTE
</li>
<li>PR #892: Assorted minor changes in Parser
</li>
<li>PR #891: Update documentation of date-time types and clean up related code a bit
</li>
<li>PR #890: Implement conversions for TIMESTAMP WITH TIME ZONE
</li>
<li>PR #888: Fix two-phase commit in MVStore
</li>
<li>Issue #884: Wrong test Resources path in pom.xml
</li>
<li>PR #886: Fix building of documentation
</li>
<li>PR #883: Add support for TIMESTAMP WITH TIME ZONE to FORMATDATETIME
</li>
<li>PR #881: Reimplement dateValueFromDate() and nanosFromDate() without a Calendar
</li>
<li>PR #880: Assorted date-time related changes
</li>
<li>PR #879: Reimplement TO_DATE without a Calendar and fix a lot of bugs an incompatibilities
</li>
<li>PR #878: Fix IYYY in TO_CHAR and reimplement TRUNCATE without a Calendar
</li>
<li>PR #877: Reimplement TO_CHAR without a Calendar and fix 12 AM / 12 PM in it
</li>
<li>PR #876: Test out of memory
</li>
<li>PR #875: Improve date-time related parts of documentation
</li>
<li>PR #872: Assorted date-time related changes
</li>
<li>PR #871: Fix OOME in Transfer.readValue() with large CLOB V2
</li>
<li>PR #867: TestOutOfMemory stability
</li>
<li>Issue #834: Add support for the SQL standard FILTER clause on aggregate functions
</li>
<li>PR #864: Minor changes in DateUtils and Function
</li>
<li>PR #863: Polish: use isEmpty() to check whether the collection is empty or not.
</li>
<li>PR #862: Convert constraint type into enum
</li>
<li>PR #861: Avoid resource leak
</li>
<li>PR #860: IndexCursor inList
</li>
<li>PR #858 / Issue #690 and others: Return all generated rows and columns from getGeneratedKeys()
</li>
<li>Make the JDBC client independent of the database engine
</li>
<li>PR #857: Do not write each SQL error multiple times in TestScript
</li>
<li>PR #856: Fix TestDateTimeUtils.testDayOfWeek() and example with ANY(?
</li>
<li>PR #855: Reimplement DATEADD without a Calendar and fix some incompatibilities
</li>
<li>PR #854: Improve test stability
</li>
<li>PR #851: Reimplement DATEDIFF without a Calendar
</li>
<li>Issue #502: SQL "= ANY (?)" supported?
</li>
<li>PR #849: Encode date and time in fast and proper way in PgServerThread
</li>
<li>PR #847: Reimplement remaining parts of EXTRACT, ISO_YEAR, etc without a Calendar
</li>
<li>PR #846: Read known fields directly in DateTimeUtils.getDatePart()
</li>
<li>Issue #832: Extract EPOCH from a timestamp
</li>
<li>PR #844: Add simple implementations of isWrapperFor() and unwrap() to JdbcDataSource
</li>
<li>PR #843: Add MEDIAN to help.csv and fix building of documentation
</li>
<li>PR #841: Support indexes with nulls last for MEDIAN aggregate
</li>
<li>PR #840: Add MEDIAN aggregate
</li>
<li>PR #839: TestTools should not leave testing thread in interrupted state
</li>
<li>PR #838: (tests) Excessive calls to Runtime.getRuntime().gc() cause OOM for no reason
</li>
<li>Don't use substring when doing StringBuffer#append
</li>
<li>PR #837: Use StringUtils.replaceAll() in Function.replace()
</li>
<li>PR #836: Allow to read invalid February 29 dates with LocalDate as March 1
</li>
<li>PR #835: Inline getTimeTry() into DateTimeUtils.getMillis()
</li>
<li>PR #827: Use dateValueFromDate() and nanosFromDate() in parseTimestamp()
</li>
<li>Issue #115: to_char fails with pattern FM0D099
</li>
<li>PR #825: Merge code for parsing and formatting timestamp values
</li>
<li>Enums for ConstraintActionType, UnionType, and OpType
</li>
<li>PR 824: Add partial support for INSERT IGNORE in MySQL mode
</li>
<li>PR #823: Use ValueByte.getInt() and ValueShort.getInt() in convertTo()
</li>
<li>PR #820: Fix some compiler warnings
</li>
<li>PR #818: Fixes for remaining issues with boolean parameters
</li>
<li>Use enum for file lock method
</li>
<li>PR #817: Parse also 1 as true and 0 as false in Utils.parseBoolean()
</li>
<li>PR #815: Fix count of completed statements
</li>
<li>PR #814: Method.isVarArgs() is available on all supported platforms
</li>
<li>Issue #812: TIME values should be in range 0:00:00.000000000 23:59:59.999999999?
</li>
<li>PR #811: Issues with Boolean.parseBoolean()
</li>
<li>PR #809: Use type constants from LocalDateTimeUtils directly
</li>
<li>PR #808: Use HmacSHA256 provided by JRE
</li>
<li>PR #807: Use SHA-256 provided by JRE / Android and use rotateLeft / Right in Fog
</li>
<li>PR #806: Implement setBytes() and setString() with offset and len
</li>
...
@@ -79,47 +79,63 @@ The following can be skipped currently; benchmarks should probably be removed:
## Build the Release
-Change directory to src/installer
-Run ./buildRelease.sh (non-Windows) or buildRelease.bat (Windows)
-Scan for viruses
-Test installer, H2 Console (test new languages)
-Check docs, versions and links in main, downloads, build numbers
-Check the PDF file size
+Run the following commands:
+Non-Windows:
+cd src/installer
+./buildRelease.sh
+Windows:
+cd src/installer
+buildRelease.bat
+Scan for viruses.
+Test installer, H2 Console (test new languages).
+Check docs, versions and links in main, downloads, build numbers.
+Check the PDF file size.

Upload (http and https) to ftp://h2database.com/javadoc
Upload (http and https) to ftp://h2database.com
Upload (http and https) to ftp://h2database.com/m2-repo

-Github: create a release
-Newsletter: prepare (always to BCC)
-Newsletter: send to h2-database-jp@googlegroups.com; h2-database@googlegroups.com; h2database-news@googlegroups.com; ...
-Add to http://twitter.com
-- tweet: add @geospatialnews for the new geometry type and disk spatial index
+Github: create a release.
+Newsletter: send (always to BCC!), the following:
+h2-database-jp@googlegroups.com; h2-database@googlegroups.com; h2database-news@googlegroups.com; ...
+Create tweet at http://twitter.com

Sign files and publish files on Maven Central
(check java version is 1.7)

./build.sh clean compile jar mavenDeployCentral
cd /data/h2database/m2-repo/com/h2database
# remove sha and md5 files:
find . -name "*.sha1" -delete
find . -name "*.md5" -delete
cd h2/1...
# for each file separately (-javadoc.jar, -sources.jar, .jar, .pom):
gpg -u "Thomas Mueller Graf <thomas.tom.mueller@gmail.com>" -ab h2-...
jar -cvf bundle.jar h2-*
cd ../../h2-mvstore/1...
# for each file separately (-javadoc.jar, -sources.jar, .jar, .pom):
gpg -u "Thomas Mueller Graf <thomas.tom.mueller@gmail.com>" -ab h2-mvstore...
jar -cvf bundle.jar h2-*
# http://central.sonatype.org/pages/ossrh-guide.html
# http://central.sonatype.org/pages/manual-staging-bundle-creation-and-deployment.html
# https://oss.sonatype.org/#welcome - Log In "t..."
# - Staging Upload
# - Upload Mode: Artifact Bundle, Select Bundle to Upload... - /data/.../bundle.jar
# - Upload Bundle - Staging Repositories - select comh2database - Release - Confirm
# - Staging Upload
# - Upload Mode: Artifact Bundle, Select Bundle to Upload... - /data/.../bundle.jar
# - Upload Bundle - Staging Repositories - select comh2database - Release - Confirm

-Update statistics
-Change version in pom.xml, commit
+Update statistics.
+Change version in pom.xml, commit.
@@ -147,6 +147,7 @@ import org.h2.table.Table;
import org.h2.table.TableFilter;
import org.h2.table.TableFilter.TableFilterVisitor;
import org.h2.table.TableView;
+import org.h2.util.DateTimeFunctions;
import org.h2.util.MathUtils;
import org.h2.util.New;
import org.h2.util.ParserUtil;
@@ -793,8 +794,7 @@ public class Parser {
do {
Column column = readTableColumn(filter);
columns.add(column);
-} while (readIf(","));
-read(")");
+} while (readIfMore(true));
read("=");
Expression expression = readExpression();
if (columns.size() == 1) {
@@ -905,8 +905,7 @@ public class Parser {
column.sortType |= SortOrder.NULLS_LAST;
}
}
-} while (readIf(","));
-read(")");
+} while (readIfMore(true));
return columns.toArray(new IndexColumn[0]);
}
@@ -915,7 +914,7 @@ public class Parser {
do {
String columnName = readColumnIdentifier();
columns.add(columnName);
-} while (readIfMore());
+} while (readIfMore(false));
return columns.toArray(new String[0]);
}
@@ -930,7 +929,7 @@ public class Parser {
column.getSQL());
}
columns.add(column);
-} while (readIfMore());
+} while (readIfMore(false));
}
return columns.toArray(new Column[0]);
}
@@ -943,9 +942,16 @@ public class Parser {
return table.getColumn(id);
}
-private boolean readIfMore() {
+/**
+ * Read comma or closing brace.
+ *
+ * @param strict
+ *            if {@code false} additional comma before brace is allowed
+ * @return {@code true} if comma is read, {@code false} if brace is read
+ */
+private boolean readIfMore(boolean strict) {
if (readIf(",")) {
-return !readIf(")");
+return strict || !readIf(")");
}
read(")");
return false;
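The strict flag only controls whether a trailing comma directly before the closing parenthesis is accepted: readIfMore(false) swallows it, while readIfMore(true) expects another element after every comma. A self-contained sketch of that contract (hypothetical CommaListSketch class, my illustration rather than H2 code):

    import java.util.ArrayList;
    import java.util.Arrays;
    import java.util.List;

    // Sketch of the readIfMore(boolean strict) contract: consume either ","
    // (another element follows) or ")" (end of the list). Lenient mode
    // additionally accepts a trailing comma directly before ")".
    public class CommaListSketch {
        private final List<String> tokens;
        private int pos;

        CommaListSketch(List<String> tokens) {
            this.tokens = tokens;
        }

        List<String> readList(boolean strict) {
            List<String> items = new ArrayList<>();
            read("(");
            do {
                items.add(tokens.get(pos++));
            } while (readIfMore(strict));
            return items;
        }

        private boolean readIfMore(boolean strict) {
            if (readIf(",")) {
                // strict: a comma always announces another element;
                // lenient: "," immediately followed by ")" ends the list
                return strict || !readIf(")");
            }
            read(")");
            return false;
        }

        private boolean readIf(String expected) {
            if (pos < tokens.size() && tokens.get(pos).equals(expected)) {
                pos++;
                return true;
            }
            return false;
        }

        private void read(String expected) {
            if (!readIf(expected)) {
                throw new IllegalStateException("expected " + expected + " at position " + pos);
            }
        }

        public static void main(String[] args) {
            // lenient: the trailing comma is swallowed
            System.out.println(new CommaListSketch(
                    Arrays.asList("(", "A", ",", "B", ",", ")")).readList(false)); // [A, B]
            // strict: the same list without a trailing comma
            System.out.println(new CommaListSketch(
                    Arrays.asList("(", "A", ",", "B", ")")).readList(true));       // [A, B]
        }
    }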
@@ -1109,7 +1115,7 @@ public class Parser {
} else {
values.add(readExpression());
}
-} while (readIfMore());
+} while (readIfMore(false));
}
command.addRow(values.toArray(new Expression[0]));
} while (readIf(","));
@@ -1280,7 +1286,7 @@ public class Parser {
} else {
values.add(readExpression());
}
-} while (readIfMore());
+} while (readIfMore(false));
}
command.addRow(values.toArray(new Expression[0]));
// the following condition will allow (..),; and (..);
@@ -1339,7 +1345,7 @@ public class Parser {
} else {
values.add(readExpression());
}
-} while (readIfMore());
+} while (readIfMore(false));
}
command.addRow(values.toArray(new Expression[0]));
} while (readIf(","));
@@ -1475,8 +1481,7 @@ public class Parser {
String indexName = readIdentifierWithSchema();
Index index = table.getIndex(indexName);
indexNames.add(index.getName());
-} while (readIf(","));
-read(")");
+} while (readIfMore(true));
}
return IndexHints.createUseIndexHints(indexNames);
}
@@ -1502,8 +1507,7 @@ public class Parser {
ArrayList<String> derivedColumnNames = New.arrayList();
do {
derivedColumnNames.add(readAliasIdentifier());
-} while (readIf(","));
-read(")");
+} while (readIfMore(true));
return derivedColumnNames;
}
return null;
@@ -2721,8 +2725,7 @@ public class Parser {
ArrayList<Expression> params = New.arrayList();
do {
params.add(readExpression());
-} while (readIf(","));
-read(")");
+} while (readIfMore(true));
Expression filterCondition;
if (readIf("FILTER")) {
read("(");
@@ -2798,7 +2801,7 @@ public class Parser {
}
case Function.DATE_ADD:
case Function.DATE_DIFF: {
-if (Function.isDatePart(currentToken)) {
+if (DateTimeFunctions.isDatePart(currentToken)) {
function.setParameter(0,
ValueExpression.get(ValueString.get(currentToken)));
read();
@@ -2893,8 +2896,7 @@ public class Parser {
read("=");
function.setParameter(i, readExpression());
i++;
-} while (readIf(","));
-read(")");
+} while (readIfMore(true));
TableFunction tf = (TableFunction) function;
tf.setColumns(columns);
break;
@@ -2914,8 +2916,7 @@ public class Parser {
int i = 0;
do {
function.setParameter(i++, readExpression());
-} while (readIf(","));
-read(")");
+} while (readIfMore(true));
}
}
function.doneWithParameters();
@@ -3541,19 +3542,6 @@ public class Parser {
return false;
}
-/*
- * Reads passed token in list, in order and returns true on first match.
- * If none of the token matches returns false
- */
-private boolean readIfOr(String... tokens) {
-for (String token: tokens) {
-if (readIf(token)) {
-return true;
-}
-}
-return false;
-}
/*
* Reads every token in list, in order - returns true if all are found.
* If any are not found, returns false - AND resets parsing back to state when called.
@@ -4153,40 +4141,39 @@ public class Parser {
break;
}
} else if (s.length() == 2) {
+char c1 = s.charAt(1);
switch (c0) {
case ':':
-if ("::".equals(s)) {
-return KEYWORD;
-} else if (":=".equals(s)) {
+if (c1 == ':' || c1 == '=') {
return KEYWORD;
}
break;
case '>':
-if (">=".equals(s)) {
+if (c1 == '=') {
return BIGGER_EQUAL;
}
break;
case '<':
-if ("<=".equals(s)) {
+if (c1 == '=') {
return SMALLER_EQUAL;
-} else if ("<>".equals(s)) {
+} else if (c1 == '>') {
return NOT_EQUAL;
}
break;
case '!':
-if ("!=".equals(s)) {
+if (c1 == '=') {
return NOT_EQUAL;
-} else if ("!~".equals(s)) {
+} else if (c1 == '~') {
return KEYWORD;
}
break;
case '|':
-if ("||".equals(s)) {
+if (c1 == '|') {
return STRING_CONCAT;
}
break;
case '&':
-if ("&&".equals(s)) {
+if (c1 == '&') {
return SPATIAL_INTERSECTS;
}
break;
@@ -4490,7 +4477,9 @@ public class Parser {
}
original += "(" + p;
// Oracle syntax
-readIfOr("CHAR", "BYTE");
+if (!readIf("CHAR")) {
+readIf("BYTE");
+}
if (dataType.supportsScale) {
if (readIf(",")) {
scale = readInt();
@@ -4524,13 +4513,12 @@ public class Parser {
String enumerator0 = readString();
enumeratorList.add(enumerator0);
original += "'" + enumerator0 + "'";
-while (readIf(",")) {
+while (readIfMore(true)) {
original += ',';
String enumeratorN = readString();
original += "'" + enumeratorN + "'";
enumeratorList.add(enumeratorN);
}
-read(")");
original += ')';
enumerators = enumeratorList.toArray(new String[0]);
}
@@ -4850,10 +4838,7 @@ public class Parser {
columns.set(i, column);
row.add(expr);
i++;
-} while (multiColumn && readIf(","));
-if (multiColumn) {
-read(")");
-}
+} while (multiColumn && readIfMore(true));
rows.add(row);
} while (readIf(","));
int columnCount = columns.size();
@@ -6357,8 +6342,7 @@ public class Parser {
command.setIfNotExists(false);
do {
parseTableColumnDefinition(command, schema, tableName);
-} while (readIf(","));
-read(")");
+} while (readIfMore(true));
} else {
boolean ifNotExists = readIfNotExists();
command.setIfNotExists(ifNotExists);
@@ -6609,7 +6593,7 @@ public class Parser {
if (!readIf(")")) {
do {
parseTableColumnDefinition(command, schema, tableName);
-} while (readIfMore());
+} while (readIfMore(false));
}
}
// Allows "COMMENT='comment'" in DDL statements (MySQL syntax)
...
@@ -84,6 +84,9 @@ public class AlterTableAlterColumn extends CommandWithColumns {
this.oldColumn = oldColumn;
}
+/**
+ * Add the column as the first column of the table.
+ */
public void setAddFirst() {
addFirst = true;
}
...
@@ -60,6 +60,12 @@ public abstract class CommandWithColumns extends SchemaCommand {
}
}
+/**
+ * For the given list of columns, disable "nullable" for those columns that
+ * are primary key columns.
+ *
+ * @param columns the list of columns
+ */
protected void changePrimaryKeysToNotNull(ArrayList<Column> columns) {
if (pkColumns != null) {
for (Column c : columns) {
@@ -72,6 +78,9 @@
}
}
+/**
+ * Create the constraints.
+ */
protected void createConstraints() {
if (constraintCommands != null) {
for (DefineCommand command : constraintCommands) {
@@ -81,6 +90,15 @@
}
}
+/**
+ * For the given list of columns, create sequences for auto-increment
+ * columns (if needed), and then get the list of all sequences of the
+ * columns.
+ *
+ * @param columns the columns
+ * @param temporary whether generated sequences should be temporary
+ * @return the list of sequences (may be empty)
+ */
protected ArrayList<Sequence> generateSequences(ArrayList<Column> columns, boolean temporary) {
ArrayList<Sequence> sequences = New.arrayList();
if (columns != null) {
...
@@ -12,7 +12,9 @@ import org.h2.result.SearchRow;
import org.h2.value.Value;
/**
- * Abstract function cursor.
+ * Abstract function cursor. This implementation filters the rows (only returns
+ * entries that are larger or equal to "first", and smaller than last or equal
+ * to "last").
 */
abstract class AbstractFunctionCursor implements Cursor {
private final FunctionIndex index;
@@ -85,6 +87,11 @@
return false;
}
+/**
+ * Skip to the next row if one is available. This method does not filter.
+ *
+ * @return true if another row is available
+ */
abstract boolean nextImpl();
@Override
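As the updated class comment says, the cursor only passes rows through when they fall between "first" and "last", while nextImpl() fetches unfiltered rows. A generic sketch of the same filtering idea (hypothetical BoundedIterator, my illustration, not the H2 Cursor API):

    import java.util.Arrays;
    import java.util.Comparator;
    import java.util.Iterator;
    import java.util.List;
    import java.util.NoSuchElementException;

    // Wraps an unfiltered source (the analogue of nextImpl()) and only returns
    // elements r with first <= r <= last; null bounds mean "unbounded".
    public class BoundedIterator<T> implements Iterator<T> {
        private final Iterator<T> base;
        private final Comparator<T> comparator;
        private final T first;
        private final T last;
        private T next;

        public BoundedIterator(Iterator<T> base, Comparator<T> comparator, T first, T last) {
            this.base = base;
            this.comparator = comparator;
            this.first = first;
            this.last = last;
            advance();
        }

        private void advance() {
            next = null;
            while (base.hasNext()) {
                T candidate = base.next();
                boolean tooSmall = first != null && comparator.compare(candidate, first) < 0;
                boolean tooLarge = last != null && comparator.compare(candidate, last) > 0;
                if (!tooSmall && !tooLarge) {
                    next = candidate;
                    return;
                }
            }
        }

        @Override
        public boolean hasNext() {
            return next != null;
        }

        @Override
        public T next() {
            if (next == null) {
                throw new NoSuchElementException();
            }
            T result = next;
            advance();
            return result;
        }

        public static void main(String[] args) {
            List<Integer> rows = Arrays.asList(5, 1, 9, 3, 7);
            BoundedIterator<Integer> it =
                    new BoundedIterator<>(rows.iterator(), Integer::compare, 3, 7);
            it.forEachRemaining(System.out::println); // prints 5, 3, 7
        }
    }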
...
@@ -1329,6 +1329,7 @@ public class JdbcPreparedStatement extends JdbcStatement implements
Value[] set = new Value[size];
for (int i = 0; i < size; i++) {
ParameterInterface param = parameters.get(i);
+param.checkSet();
Value value = param.getParamValue();
set[i] = value;
}
...
@@ -354,7 +354,7 @@ public final class MVStore {
int kb = DataUtils.getConfigParam(config, "autoCommitBufferSize", 1024);
// 19 KB memory is about 1 KB storage
autoCommitMemory = kb * 1024 * 19;
-autoCompactFillRate = DataUtils.getConfigParam(config, "autoCompactFillRate", 50);
+autoCompactFillRate = DataUtils.getConfigParam(config, "autoCompactFillRate", 40);
char[] encryptionKey = (char[]) config.get("encryptionKey");
try {
if (!fileStoreIsProvided) {
@@ -1083,14 +1083,7 @@ public final class MVStore {
private void storeNow() {
assert Thread.holdsLock(this);
long time = getTimeSinceCreation();
-int freeDelay = retentionTime / 10;
-if (time >= lastFreeUnusedChunks + freeDelay) {
-// set early in case it fails (out of memory or so)
-lastFreeUnusedChunks = time;
-freeUnusedChunks();
-// set it here as well, to avoid calling it often if it was slow
-lastFreeUnusedChunks = getTimeSinceCreation();
-}
+freeUnusedIfNeeded(time);
int currentUnsavedPageCount = unsavedMemory;
long storeVersion = currentStoreVersion;
long version = ++currentVersion;
@@ -1125,7 +1118,6 @@ public final class MVStore {
}
}
Chunk c = new Chunk(newChunkId);
-c.pageCount = Integer.MAX_VALUE;
c.pageCountLive = Integer.MAX_VALUE;
c.maxLen = Long.MAX_VALUE;
c.maxLenLive = Long.MAX_VALUE;
@@ -1280,6 +1272,21 @@ public final class MVStore {
lastStoredVersion = storeVersion;
}
+/**
+ * Try to free unused chunks. This method doesn't directly write, but can
+ * change the metadata, and therefore cause a background write.
+ */
+private void freeUnusedIfNeeded(long time) {
+int freeDelay = retentionTime / 5;
+if (time >= lastFreeUnusedChunks + freeDelay) {
+// set early in case it fails (out of memory or so)
+lastFreeUnusedChunks = time;
+freeUnusedChunks();
+// set it here as well, to avoid calling it often if it was slow
+lastFreeUnusedChunks = getTimeSinceCreation();
+}
+}
private synchronized void freeUnusedChunks() {
if (lastChunk != null && reuseSpace) {
Set<Integer> referenced = collectReferencedChunks();
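The extracted helper is essentially a time-based throttle: freeing runs at most once per retentionTime / 5, and the timestamp is stamped before the call (so a failure still counts as an attempt) and again after it (so a slow run does not immediately re-trigger). A standalone sketch of that pattern (hypothetical Throttle class, my illustration rather than MVStore code):

    // Run an expensive task at most once per "delay" milliseconds, stamping
    // the clock before the task and again after it.
    public class Throttle {
        private final long delayMillis;
        private long lastRun = Long.MIN_VALUE / 2; // "never run yet" sentinel

        public Throttle(long delayMillis) {
            this.delayMillis = delayMillis;
        }

        public void runIfDue(Runnable task) {
            long now = System.currentTimeMillis();
            if (now >= lastRun + delayMillis) {
                lastRun = now;                        // set early in case the task fails
                task.run();
                lastRun = System.currentTimeMillis(); // and again in case it was slow
            }
        }

        public static void main(String[] args) throws InterruptedException {
            Throttle throttle = new Throttle(200);
            for (int i = 0; i < 10; i++) {
                throttle.runIfDue(() -> System.out.println("expensive cleanup"));
                Thread.sleep(50); // only roughly every fourth iteration actually runs
            }
        }
    }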
@@ -1574,11 +1581,11 @@
*/
private long getFileLengthInUse() {
long result = fileStore.getFileLengthInUse();
-assert result == _getFileLengthInUse() : result + " != " + _getFileLengthInUse();
+assert result == measureFileLengthInUse() : result + " != " + measureFileLengthInUse();
return result;
}
-private long _getFileLengthInUse() {
+private long measureFileLengthInUse() {
long size = 2;
for (Chunk c : chunks.values()) {
if (c.len != Integer.MAX_VALUE) {
@@ -1842,6 +1849,39 @@
}
}
+/**
+ * Get the current fill rate (percentage of used space in the file). Unlike
+ * the fill rate of the store, here we only account for chunk data; the fill
+ * rate here is how much of the chunk data is live (still referenced). Young
+ * chunks are considered live.
+ *
+ * @return the fill rate, in percent (100 is completely full)
+ */
+public int getCurrentFillRate() {
+long maxLengthSum = 1;
+long maxLengthLiveSum = 1;
+long time = getTimeSinceCreation();
+for (Chunk c : chunks.values()) {
+maxLengthSum += c.maxLen;
+if (c.time + retentionTime > time) {
+// young chunks (we don't optimize those):
+// assume if they are fully live
+// so that we don't try to optimize yet
+// until they get old
+maxLengthLiveSum += c.maxLen;
+} else {
+maxLengthLiveSum += c.maxLenLive;
+}
+}
+// the fill rate of all chunks combined
+if (maxLengthSum <= 0) {
+// avoid division by 0
+maxLengthSum = 1;
+}
+int fillRate = (int) (100 * maxLengthLiveSum / maxLengthSum);
+return fillRate;
+}
private ArrayList<Chunk> findOldChunks(int targetFillRate, int write) {
if (lastChunk == null) {
// nothing to do
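In other words, the fill rate is the live fraction of all chunk payload, with chunks younger than the retention time counted as fully live so that they never look worth compacting on their own. A standalone sketch of the same arithmetic (hypothetical ChunkInfo/FillRateSketch classes, my illustration rather than MVStore code):

    import java.util.Arrays;
    import java.util.List;

    // Live bytes divided by total bytes over all chunks; "young" chunks
    // (newer than the retention time) are treated as fully live.
    public class FillRateSketch {
        static final class ChunkInfo {
            final long maxLen;      // total payload written to the chunk
            final long maxLenLive;  // payload that is still referenced
            final long ageMillis;   // how old the chunk is

            ChunkInfo(long maxLen, long maxLenLive, long ageMillis) {
                this.maxLen = maxLen;
                this.maxLenLive = maxLenLive;
                this.ageMillis = ageMillis;
            }
        }

        static int fillRate(List<ChunkInfo> chunks, long retentionTimeMillis) {
            long maxLengthSum = 1;      // start at 1 to avoid division by zero
            long maxLengthLiveSum = 1;
            for (ChunkInfo c : chunks) {
                maxLengthSum += c.maxLen;
                // young chunks are assumed fully live, so they never drag the
                // fill rate down and do not trigger compaction on their own
                maxLengthLiveSum += c.ageMillis < retentionTimeMillis ? c.maxLen : c.maxLenLive;
            }
            return (int) (100 * maxLengthLiveSum / maxLengthSum);
        }

        public static void main(String[] args) {
            List<ChunkInfo> chunks = Arrays.asList(
                    new ChunkInfo(1000, 250, 60_000),   // old, mostly garbage
                    new ChunkInfo(1000, 900, 60_000),   // old, mostly live
                    new ChunkInfo(1000, 100, 1_000));   // young: counted as 1000 live
            System.out.println(fillRate(chunks, 45_000)); // prints 71
        }
    }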
@@ -2543,10 +2583,8 @@
fileOps = false;
}
// use a lower fill rate if there were any file operations
-int fillRate = fileOps ? autoCompactFillRate / 3 : autoCompactFillRate;
-// TODO how to avoid endless compaction if there is a bug
-// in the bookkeeping?
-compact(fillRate, autoCommitMemory);
+int targetFillRate = fileOps ? autoCompactFillRate / 3 : autoCompactFillRate;
+compact(targetFillRate, autoCommitMemory);
autoCompactLastFileOpCount = fileStore.getWriteCount() + fileStore.getReadCount();
}
} catch (Throwable e) {
@@ -2915,7 +2953,7 @@ * this value, then chunks at the end of the file are moved. Compaction
* stops if the target fill rate is reached.
* <p>
-* The default value is 50 (50%). The value 0 disables auto-compacting.
+* The default value is 40 (40%). The value 0 disables auto-compacting.
* <p>
*
* @param percent the target fill rate
...
@@ -96,6 +96,7 @@ public class StreamStore {
* @param in the stream
* @return the id (potentially an empty array)
*/
+@SuppressWarnings("resource")
public byte[] put(InputStream in) throws IOException {
ByteArrayOutputStream id = new ByteArrayOutputStream();
int level = 0;
...
(Diff collapsed.)
@@ -9,12 +9,9 @@ package org.h2.util;
import java.sql.Date;
import java.sql.Time;
import java.sql.Timestamp;
-import java.text.SimpleDateFormat;
import java.util.Calendar;
import java.util.GregorianCalendar;
-import java.util.Locale;
import java.util.TimeZone;
-import org.h2.api.ErrorCode;
import org.h2.engine.Mode;
import org.h2.message.DbException;
import org.h2.value.Value;
@@ -58,7 +55,7 @@ public class DateTimeUtils {
/**
* Date value for 1970-01-01.
*/
-private static final int EPOCH_DATE_VALUE = (1970 << SHIFT_YEAR) + (1 << SHIFT_MONTH) + 1;
+public static final int EPOCH_DATE_VALUE = (1970 << SHIFT_YEAR) + (1 << SHIFT_MONTH) + 1;
private static final int[] NORMAL_DAYS_PER_MONTH = { 0, 31, 28, 31, 30, 31,
30, 31, 31, 30, 31, 30, 31 };
@@ -744,8 +741,7 @@
* @return number of day in year
*/
public static int getDayOfYear(long dateValue) {
-int year = yearFromDateValue(dateValue);
-return (int) (absoluteDayFromDateValue(dateValue) - absoluteDayFromDateValue(dateValue(year, 1, 1))) + 1;
+return (int) (absoluteDayFromDateValue(dateValue) - absoluteDayFromYear(yearFromDateValue(dateValue))) + 1;
}
/**
@@ -825,7 +821,7 @@
}
private static long getWeekOfYearBase(int year, int firstDayOfWeek, int minimalDaysInFirstWeek) {
-long first = absoluteDayFromDateValue(dateValue(year, 1, 1));
+long first = absoluteDayFromYear(year);
int daysInFirstWeek = 8 - getDayOfWeekFromAbsolute(first, firstDayOfWeek);
long base = first + daysInFirstWeek;
if (daysInFirstWeek >= minimalDaysInFirstWeek) {
@@ -860,67 +856,6 @@
return year;
}
-/**
- * Formats a date using a format string.
- *
- * @param date the date to format
- * @param format the format string
- * @param locale the locale
- * @param timeZone the timezone
- * @return the formatted date
- */
-public static String formatDateTime(java.util.Date date, String format,
-String locale, String timeZone) {
-SimpleDateFormat dateFormat = getDateFormat(format, locale, timeZone);
-synchronized (dateFormat) {
-return dateFormat.format(date);
-}
-}
-/**
- * Parses a date using a format string.
- *
- * @param date the date to parse
- * @param format the parsing format
- * @param locale the locale
- * @param timeZone the timeZone
- * @return the parsed date
- */
-public static java.util.Date parseDateTime(String date, String format,
-String locale, String timeZone) {
-SimpleDateFormat dateFormat = getDateFormat(format, locale, timeZone);
-try {
-synchronized (dateFormat) {
-return dateFormat.parse(date);
-}
-} catch (Exception e) {
-// ParseException
-throw DbException.get(ErrorCode.PARSE_ERROR_1, e, date);
-}
-}
-private static SimpleDateFormat getDateFormat(String format, String locale,
-String timeZone) {
-try {
-// currently, a new instance is create for each call
-// however, could cache the last few instances
-SimpleDateFormat df;
-if (locale == null) {
-df = new SimpleDateFormat(format);
-} else {
-Locale l = new Locale(locale);
-df = new SimpleDateFormat(format, l);
-}
-if (timeZone != null) {
-df.setTimeZone(TimeZone.getTimeZone(timeZone));
-}
-return df;
-} catch (Exception e) {
-throw DbException.get(ErrorCode.PARSE_ERROR_1, e,
-format + "/" + locale + "/" + timeZone);
-}
-}
/**
* Returns number of days in month.
*
@@ -1230,6 +1165,26 @@
return ValueTimestampTimeZone.fromDateValueAndNanos(dateValue, timeNanos, (short) offsetMins);
}
+/**
+ * Calculate the absolute day for a January, 1 of the specified year.
+ *
+ * @param year
+ *            the year
+ * @return the absolute day
+ */
+public static long absoluteDayFromYear(long year) {
+year--;
+long a = ((year * 1461L) >> 2) - 719_177;
+if (year < 1582) {
+// Julian calendar
+a += 13;
+} else if (year < 1900 || year > 2099) {
+// Gregorian calendar (slow mode)
+a += (year / 400) - (year / 100) + 15;
+}
+return a;
+}
/**
* Calculate the absolute day from an encoded date value.
*
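The result for 1970 is 0, so the value is the number of days of January 1 relative to 1970-01-01. A quick cross-check of the Gregorian range against java.time (my own sketch, not part of the patch; the absoluteDayFromYear copy below repeats the arithmetic from the hunk above):

    import java.time.LocalDate;

    // For proleptic-Gregorian years (1583 and later) absoluteDayFromYear()
    // must agree with java.time's epoch day of January 1, since both count
    // days relative to 1970-01-01.
    public class AbsoluteDayFromYearCheck {
        // Same arithmetic as the method added above.
        static long absoluteDayFromYear(long year) {
            year--;
            long a = ((year * 1461L) >> 2) - 719_177;
            if (year < 1582) {
                a += 13;                               // Julian calendar
            } else if (year < 1900 || year > 2099) {
                a += (year / 400) - (year / 100) + 15; // Gregorian, slow mode
            }
            return a;
        }

        public static void main(String[] args) {
            for (int year = 1583; year <= 2500; year++) {
                long expected = LocalDate.of(year, 1, 1).toEpochDay();
                if (absoluteDayFromYear(year) != expected) {
                    throw new AssertionError("mismatch for year " + year);
                }
            }
            System.out.println("1583-2500 OK, e.g. 1970 -> " + absoluteDayFromYear(1970)); // 0
        }
    }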
@@ -1244,11 +1199,11 @@
y--;
m += 12;
}
-long a = ((y * 2922L) >> 3) + DAYS_OFFSET[m - 3] + d - 719_484;
+long a = ((y * 1461L) >> 2) + DAYS_OFFSET[m - 3] + d - 719_484;
-if (y <= 1582 && ((y < 1582) || (m * 100 + d < 1015))) {
+if (y <= 1582 && ((y < 1582) || (m * 100 + d < 10_15))) {
// Julian calendar (cutover at 1582-10-04 / 1582-10-15)
a += 13;
-} else if (y < 1901 || y > 2099) {
+} else if (y < 1900 || y > 2099) {
// Gregorian calendar (slow mode)
a += (y / 400) - (y / 100) + 15;
}
@@ -1270,8 +1225,8 @@
y--;
m += 12;
}
-long a = ((y * 2922L) >> 3) + DAYS_OFFSET[m - 3] + d - 719_484;
+long a = ((y * 1461L) >> 2) + DAYS_OFFSET[m - 3] + d - 719_484;
-if (y < 1901 || y > 2099) {
+if (y < 1900 || y > 2099) {
// Slow mode
a += (y / 400) - (y / 100) + 15;
}
@@ -1286,7 +1241,7 @@
*/
public static long dateValueFromAbsoluteDay(long absoluteDay) {
long d = absoluteDay + 719_468;
-long y100 = 0, offset;
+long y100, offset;
if (d > 578_040) {
// Gregorian calendar
long y400 = d / 146_097;
@@ -1296,6 +1251,7 @@
offset = y400 * 400 + y100 * 100;
} else {
// Julian calendar
+y100 = 0;
d += 292_200_000_002L;
offset = -800_000_000;
}
@@ -1339,14 +1295,13 @@
if (day < getDaysInMonth(year, month)) {
return dateValue + 1;
}
-day = 1;
if (month < 12) {
month++;
} else {
month = 1;
year++;
}
-return dateValue(year, month, day);
+return dateValue(year, month, 1);
}
/**
...
@@ -91,8 +91,7 @@ public class ToDateParser {
}
if (doyValid) {
dateValue = DateTimeUtils.dateValueFromAbsoluteDay(
-DateTimeUtils.absoluteDayFromDateValue(DateTimeUtils.dateValue(year, 1, 1))
-+ dayOfYear - 1);
+DateTimeUtils.absoluteDayFromYear(year) + dayOfYear - 1);
} else {
int month = this.month;
if (month == 0) {
...
@@ -845,8 +845,7 @@ public abstract class Value {
case TIME:
// because the time has set the date to 1970-01-01,
// this will be the result
-return ValueDate.fromDateValue(
-DateTimeUtils.dateValue(1970, 1, 1));
+return ValueDate.fromDateValue(DateTimeUtils.EPOCH_DATE_VALUE);
case TIMESTAMP:
return ValueDate.fromDateValue(
((ValueTimestamp) this).getDateValue());
...
@@ -142,6 +142,7 @@ import org.h2.test.store.TestKillProcessWhileWriting;
import org.h2.test.store.TestMVRTree;
import org.h2.test.store.TestMVStore;
import org.h2.test.store.TestMVStoreBenchmark;
+import org.h2.test.store.TestMVStoreStopCompact;
import org.h2.test.store.TestMVStoreTool;
import org.h2.test.store.TestMVTableEngine;
import org.h2.test.store.TestObjectDataType;
@@ -889,6 +890,7 @@ kill -9 `jps -l | grep "org.h2.test." | cut -d " " -f 1`
addTest(new TestMVRTree());
addTest(new TestMVStore());
addTest(new TestMVStoreBenchmark());
+addTest(new TestMVStoreStopCompact());
addTest(new TestMVStoreTool());
addTest(new TestMVTableEngine());
addTest(new TestObjectDataType());
...
@@ -710,6 +710,11 @@ public class TestIndex extends TestBase {
trace("---done---");
}
+/**
+ * This method is called from the database.
+ *
+ * @return the result set
+ */
public static ResultSet testFunctionIndexFunction() {
// There are additional callers like JdbcConnection.prepareCommand() and
// CommandContainer.recompileIfRequired()
...
@@ -13,6 +13,8 @@ import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;
+import org.h2.api.ErrorCode;
import org.h2.test.TestBase;
/**
@@ -210,6 +212,7 @@ public class TestBatchUpdates extends TestBase {
String s = COFFEE_UPDATE;
trace("Prepared Statement String:" + s);
prep = conn.prepareStatement(s);
+assertThrows(ErrorCode.PARAMETER_NOT_SET_1, prep).addBatch();
prep.setInt(1, 2);
prep.addBatch();
prep.setInt(1, 3);
...
/*
* Copyright 2004-2018 H2 Group. Multiple-Licensed under the MPL 2.0,
* and the EPL 1.0 (http://h2database.com/html/license.html).
* Initial Developer: H2 Group
*/
package org.h2.test.store;
import java.util.Random;
import org.h2.mvstore.MVMap;
import org.h2.mvstore.MVStore;
import org.h2.store.fs.FileUtils;
import org.h2.test.TestBase;
/**
 * Test that the MVStore eventually stops optimizing (does not excessively optimize).
*/
public class TestMVStoreStopCompact extends TestBase {
/**
* Run just this test.
*
* @param a ignored
*/
public static void main(String... a) throws Exception {
TestBase test = TestBase.createCaller().init();
test.config.big = true;
test.test();
}
@Override
public void test() throws Exception {
if (!config.big) {
return;
}
for(int retentionTime = 10; retentionTime < 1000; retentionTime *= 10) {
for(int timeout = 100; timeout <= 1000; timeout *= 10) {
testStopCompact(retentionTime, timeout);
}
}
}
private void testStopCompact(int retentionTime, int timeout) throws InterruptedException {
String fileName = getBaseDir() + "/testStopCompact.h3";
FileUtils.createDirectories(getBaseDir());
FileUtils.delete(fileName);
// store with a very small page size, to make sure
// there are many leaf pages
MVStore s = new MVStore.Builder().
fileName(fileName).open();
s.setRetentionTime(retentionTime);
MVMap<Integer, String> map = s.openMap("data");
long start = System.currentTimeMillis();
Random r = new Random(1);
for (int i = 0; i < 4000000; i++) {
long time = System.currentTimeMillis() - start;
if (time > timeout) {
break;
}
int x = r.nextInt(10000000);
map.put(x, "Hello World " + i * 10);
}
s.setAutoCommitDelay(100);
long oldWriteCount = s.getFileStore().getWriteCount();
// expect background write to stop after 5 seconds
Thread.sleep(5000);
long newWriteCount = s.getFileStore().getWriteCount();
// expect that compaction didn't cause many writes
assertTrue(newWriteCount - oldWriteCount < 30);
s.close();
}
}
@@ -26,7 +26,6 @@ public class TestClearReferences extends TestBase {
"org.h2.compress.CompressLZF.cachedHashTable",
"org.h2.engine.DbSettings.defaultSettings",
"org.h2.engine.SessionRemote.sessionFactory",
-"org.h2.expression.Function.MONTHS_AND_WEEKS",
"org.h2.jdbcx.JdbcDataSourceFactory.cachedTraceSystem",
"org.h2.store.RecoverTester.instance",
"org.h2.store.fs.FilePath.defaultProvider",
@@ -37,6 +36,7 @@
"org.h2.tools.CompressTool.cachedBuffer",
"org.h2.util.CloseWatcher.queue",
"org.h2.util.CloseWatcher.refs",
+"org.h2.util.DateTimeFunctions.MONTHS_AND_WEEKS",
"org.h2.util.DateTimeUtils.timeZone",
"org.h2.util.MathUtils.cachedSecureRandom",
"org.h2.util.NetUtils.cachedLocalAddress",
...
@@ -371,6 +371,9 @@ public class TestDate extends TestBase {
if (abs != next && next != Long.MIN_VALUE) {
assertEquals(abs, next);
}
+if (m == 1 && d == 1) {
+assertEquals(abs, DateTimeUtils.absoluteDayFromYear(y));
+}
next = abs + 1;
long d2 = DateTimeUtils.dateValueFromAbsoluteDay(abs);
assertEquals(date, d2);
...
@@ -13,7 +13,7 @@ import java.util.Random;
import org.h2.message.DbException;
import org.h2.test.TestBase;
import org.h2.test.utils.AssertThrows;
-import org.h2.util.DateTimeUtils;
+import org.h2.util.DateTimeFunctions;
import org.h2.util.StringUtils;
/**
@@ -85,7 +85,7 @@ public class TestStringUtils extends TestBase {
StringUtils.xmlText("Rand&Blue"));
assertEquals("&lt;&lt;[[[]]]&gt;&gt;",
StringUtils.xmlCData("<<[[[]]]>>"));
-Date dt = DateTimeUtils.parseDateTime(
+Date dt = DateTimeFunctions.parseDateTime(
"2001-02-03 04:05:06 GMT",
"yyyy-MM-dd HH:mm:ss z", "en", "GMT");
String s = StringUtils.xmlStartDoc()
@@ -99,10 +99,10 @@ public class TestStringUtils extends TestBase {
+ StringUtils.xmlNode("description", null, "H2 Database Engine")
+ StringUtils.xmlNode("language", null, "en-us")
+ StringUtils.xmlNode("pubDate", null,
-DateTimeUtils.formatDateTime(dt,
+DateTimeFunctions.formatDateTime(dt,
"EEE, d MMM yyyy HH:mm:ss z", "en", "GMT"))
+ StringUtils.xmlNode("lastBuildDate", null,
-DateTimeUtils.formatDateTime(dt,
+DateTimeFunctions.formatDateTime(dt,
"EEE, d MMM yyyy HH:mm:ss z", "en", "GMT"))
+ StringUtils.xmlNode("item", null,
StringUtils.xmlNode("title", null,
...
@@ -767,3 +767,8 @@ interpolated thead
die weekdiff osx subprocess dow proleptic microsecond microseconds divisible cmp denormalized suppressed saturated mcs
london dfs weekdays intermittent looked msec tstz africa monrovia asia tokyo weekday joi callers multipliers ucn
openoffice organize libre systemtables gmane sea borders announced millennium alex nordlund rarely
+opti excessively
+iterators tech enums incompatibilities loses reimplement readme reorganize