Commit 99301b9e authored by andrei

Merge remote-tracking branch 'h2database/master' into non_blocking

# Conflicts:
#	h2/src/main/org/h2/mvstore/MVStore.java
......@@ -21,7 +21,257 @@ Change Log
<h2>Next Version (unreleased)</h2>
<ul>
<li>PR #984: Minor refactorings in Parser
</li>
<li>Issue #933: MVStore background writer endless loop
</li>
<li>PR #981: Reorganize date-time functions
</li>
<li>PR #980: Add Parser.toString() method for improved debugging experience
</li>
<li>PR #979: Remove support of TCP protocol versions 6 and 7
</li>
<li>PR #977: Add database versions to javadoc of TCP protocol versions and update dictionary.txt
</li>
<li>PR #976: Add and use incrementDateValue() and decrementDateValue()
</li>
<li>Issue #974: Inline PRIMARY KEY definition loses its name
</li>
<li>PR #972: Add META-INF/versions to all non-Android jars that use Bits
</li>
<li>PR #971: Update ASM from 6.1-beta to 6.1
</li>
<li>PR #970: Added support for ENUM in prepared statement where clause
</li>
<li>PR #968: Assorted changes
</li>
<li>PR #967: Adds ARRAY_AGG function
</li>
<li>PR #966: Do not include help and images in client jar
</li>
<li>PR #965: Do not include mvstore.DataUtils in client jar and other changes
</li>
<li>PR #964: Fix TestFunctions.testToCharFromDateTime()
</li>
<li>PR #963 / Issue #962: Improve documentation of compatibility modes and fix ssl URL description
</li>
<li>Issue #219: H2 MySQL mode: ON UPDATE CURRENT_TIMESTAMP not supported
</li>
<li>PR #958: More fixes for PgServer
</li>
<li>PR #957: Update database size information and links in README.md
</li>
<li>PR #956: Move tests added in 821117f1db120a265647a063dca13ab5bee98efc to a proper place
</li>
<li>PR #955: Support getObject(?, Class) in generated keys
</li>
<li>PR #954: Avoid incorrect reads in iterators of TransactionMap
</li>
<li>PR #952: Optimize arguments for MVMap.init()
</li>
<li>PR #949: Fix table borders in PDF and other changes
</li>
<li>PR #948: Fix some grammar descriptions and ALTER TABLE DROP COLUMN parsing
</li>
<li>PR #947: Fix building of documentation and use modern names of Java versions
</li>
<li>PR #943: Assorted changes in documentation and a fix for current-time.sql
</li>
<li>PR #942: Fix page numbers in TOC in PDF and move System Tables into own HTML / section in PDF
</li>
<li>PR #941: Use >> syntax in median.sql and move out more tests from testScript.sql
</li>
<li>PR #940: Add support for MySQL: DROP INDEX index_name ON tbl_name
</li>
<li>PR #939: Short syntax for SQL tests
</li>
<li>Issue #935: The "date_trunc" function is not recognized for 'day'
</li>
<li>PR #936: Fix font size, line length, TOC, and many broken links in PDF
</li>
<li>PR #931: Assorted changes in documentation
</li>
<li>PR #930: Use Math.log10() and remove Mode.getOracle()
</li>
<li>PR #929: Remove Mode.supportOffsetFetch
</li>
<li>PR #928: Show information about office configuration instead of fallback PDF generation mode
</li>
<li>PR #926: Describe datetime fields in documentation
</li>
<li>PR #925: Fix time overflow in DATEADD
</li>
<li>Issue #416: Add support for DROP SCHEMA x { RESTRICT | CASCADE }
</li>
<li>PR #922: Parse and treat fractional seconds precision as described in SQL standard
</li>
<li>Issue #919: Add support for mixing adding constraints and columns in multi-add ALTER TABLE statement
</li>
<li>PR #916: Implement TABLE_CONSTRAINTS and REFERENTIAL_CONSTRAINTS from the SQL standard
</li>
<li>PR #915: Implement INFORMATION_SCHEMA.KEY_COLUMN_USAGE from SQL standard
</li>
<li>PR #914: Don't allow null values in ConcurrentArrayList
</li>
<li>PR #913: Assorted changes in tests and documentation
</li>
<li>Issue #755: Missing FLOAT(precision)?
</li>
<li>PR #911: Add support for MySQL-style ALTER TABLE ADD ... FIRST
</li>
<li>Issue #409: Support derived column list syntax on table alias
</li>
<li>PR #908: Remove dead code
</li>
<li>PR #907: Nest joins only if required and fix some issues with complex joins
</li>
<li>PR #906: Fix obscure error on non-standard SELECT * FROM A LEFT JOIN B NATURAL JOIN C
</li>
<li>PR #805: Move some JOIN tests from testScript.sql to own file
</li>
<li>PR #804: Remove unused parameters from readJoin() and readTableFilter()
</li>
<li>Issue #322: CSVREAD WHERE clause containing ORs duplicates number of rows
</li>
<li>PR #902: Remove DbSettings.nestedJoins
</li>
<li>PR #900: Convert duplicate anonymous classes in TableFilter to nested for reuse
</li>
<li>PR #899: Fix ON DUPLICATE KEY UPDATE for inserts with multiple rows
</li>
<li>PR #898: Parse TIME WITHOUT TIME ZONE and fix TIMESTAMP as column name
</li>
<li>PR #897: Update JTS to version 1.15.0 from LocationTech
</li>
<li>PR #896: Assorted changes in help.csv
</li>
<li>PR #895: Parse more variants of timestamps with time zones
</li>
<li>PR #893: TIMESTAMP WITHOUT TIME ZONE, TIMEZONE_HOUR, and TIMEZONE_MINUTE
</li>
<li>PR #892: Assorted minor changes in Parser
</li>
<li>PR #891: Update documentation of date-time types and clean up related code a bit
</li>
<li>PR #890: Implement conversions for TIMESTAMP WITH TIME ZONE
</li>
<li>PR #888: Fix two-phase commit in MVStore
</li>
<li>Issue #884: Wrong test Resources path in pom.xml
</li>
<li>PR #886: Fix building of documentation
</li>
<li>PR #883: Add support for TIMESTAMP WITH TIME ZONE to FORMATDATETIME
</li>
<li>PR #881: Reimplement dateValueFromDate() and nanosFromDate() without a Calendar
</li>
<li>PR #880: Assorted date-time related changes
</li>
<li>PR #879: Reimplement TO_DATE without a Calendar and fix a lot of bugs and incompatibilities
</li>
<li>PR #878: Fix IYYY in TO_CHAR and reimplement TRUNCATE without a Calendar
</li>
<li>PR #877: Reimplement TO_CHAR without a Calendar and fix 12 AM / 12 PM in it
</li>
<li>PR #876: Test out of memory
</li>
<li>PR #875: Improve date-time related parts of documentation
</li>
<li>PR #872: Assorted date-time related changes
</li>
<li>PR #871: Fix OOME in Transfer.readValue() with large CLOB V2
</li>
<li>PR #867: TestOutOfMemory stability
</li>
<li>Issue #834: Add support for the SQL standard FILTER clause on aggregate functions
</li>
<li>PR #864: Minor changes in DateUtils and Function
</li>
<li>PR #863: Polish: use isEmpty() to check whether the collection is empty or not.
</li>
<li>PR #862: Convert constraint type into enum
</li>
<li>PR #861: Avoid resource leak
</li>
<li>PR #860: IndexCursor inList
</li>
<li>PR #858 / Issue #690 and others: Return all generated rows and columns from getGeneratedKeys()
</li>
<li>Make the JDBC client independent of the database engine
</li>
<li>PR #857: Do not write each SQL error multiple times in TestScript
</li>
<li>PR #856: Fix TestDateTimeUtils.testDayOfWeek() and example with ANY(?)
</li>
<li>PR #855: Reimplement DATEADD without a Calendar and fix some incompatibilities
</li>
<li>PR #854: Improve test stability
</li>
<li>PR #851: Reimplement DATEDIFF without a Calendar
</li>
<li>Issue #502: SQL "= ANY (?)" supported?
</li>
<li>PR #849: Encode date and time in fast and proper way in PgServerThread
</li>
<li>PR #847: Reimplement remaining parts of EXTRACT, ISO_YEAR, etc without a Calendar
</li>
<li>PR #846: Read known fields directly in DateTimeUtils.getDatePart()
</li>
<li>Issue #832: Extract EPOCH from a timestamp
</li>
<li>PR #844: Add simple implementations of isWrapperFor() and unwrap() to JdbcDataSource
</li>
<li>PR #843: Add MEDIAN to help.csv and fix building of documentation
</li>
<li>PR #841: Support indexes with nulls last for MEDIAN aggregate
</li>
<li>PR #840: Add MEDIAN aggregate
</li>
<li>PR #839: TestTools should not leave testing thread in interrupted state
</li>
<li>PR #838: (tests) Excessive calls to Runtime.getRuntime().gc() cause OOM for no reason
</li>
<li>Don't use substring when doing StringBuffer#append
</li>
<li>PR #837: Use StringUtils.replaceAll() in Function.replace()
</li>
<li>PR #836: Allow to read invalid February 29 dates with LocalDate as March 1
</li>
<li>PR #835: Inline getTimeTry() into DateTimeUtils.getMillis()
</li>
<li>PR #827: Use dateValueFromDate() and nanosFromDate() in parseTimestamp()
</li>
<li>Issue #115: to_char fails with pattern FM0D099
</li>
<li>PR #825: Merge code for parsing and formatting timestamp values
</li>
<li>Enums for ConstraintActionType, UnionType, and OpType
</li>
<li>PR #824: Add partial support for INSERT IGNORE in MySQL mode
</li>
<li>PR #823: Use ValueByte.getInt() and ValueShort.getInt() in convertTo()
</li>
<li>PR #820: Fix some compiler warnings
</li>
<li>PR #818: Fixes for remaining issues with boolean parameters
</li>
<li>Use enum for file lock method
</li>
<li>PR #817: Parse also 1 as true and 0 as false in Utils.parseBoolean()
</li>
<li>PR #815: Fix count of completed statements
</li>
<li>PR #814: Method.isVarArgs() is available on all supported platforms
</li>
<li>Issue #812: TIME values should be in range 0:00:00.000000000 - 23:59:59.999999999?
</li>
<li>PR #811: Issues with Boolean.parseBoolean()
</li>
<li>PR #809: Use type constants from LocalDateTimeUtils directly
</li>
<li>PR #808: Use HmacSHA256 provided by JRE
</li>
<li>PR #807: Use SHA-256 provided by JRE / Android and use rotateLeft / Right in Fog
</li>
<li>PR #806: Implement setBytes() and setString() with offset and len
</li>
......
......@@ -79,47 +79,63 @@ The following can be skipped currently; benchmarks should probably be removed:
## Build the Release
Run the following commands:
Non-Windows:
cd src/installer
./buildRelease.sh
Windows:
cd src/installer
buildRelease.bat
Scan for viruses.
Test installer, H2 Console (test new languages).
Check docs, versions and links in main, downloads, build numbers.
Check the PDF file size.
Upload (http and https) to ftp://h2database.com/javadoc
Upload (http and https) to ftp://h2database.com
Upload (http and https) to ftp://h2database.com/m2-repo
Github: create a release.
Newsletter: send (always to BCC!), the following:
h2-database-jp@googlegroups.com; h2-database@googlegroups.com; h2database-news@googlegroups.com; ...
Create tweet at http://twitter.com
- tweet: add @geospatialnews for the new geometry type and disk spatial index
Sign files and publish files on Maven Central
./build.sh clean compile jar mavenDeployCentral
cd /data/h2database/m2-repo/com/h2database
# remove sha and md5 files:
find . -name "*.sha1" -delete
find . -name "*.md5" -delete
cd h2/1...
# for each file separately (-javadoc.jar, -sources.jar, .jar, .pom):
gpg -u "Thomas Mueller Graf <thomas.tom.mueller@gmail.com>" -ab h2-...
jar -cvf bundle.jar h2-*
cd ../../h2-mvstore/1...
# for each file separately (-javadoc.jar, -sources.jar, .jar, .pom):
gpg -u "Thomas Mueller Graf <thomas.tom.mueller@gmail.com>" -ab h2-mvstore...
jar -cvf bundle.jar h2-*
# http://central.sonatype.org/pages/ossrh-guide.html
# http://central.sonatype.org/pages/manual-staging-bundle-creation-and-deployment.html
# https://oss.sonatype.org/#welcome - Log In "t..."
# - Staging Upload
# - Upload Mode: Artifact Bundle, Select Bundle to Upload... - /data/.../bundle.jar
# - Upload Bundle - Staging Repositories - select comh2database - Release - Confirm
Update statistics.
Change version in pom.xml, commit.
......@@ -147,6 +147,7 @@ import org.h2.table.Table;
import org.h2.table.TableFilter;
import org.h2.table.TableFilter.TableFilterVisitor;
import org.h2.table.TableView;
import org.h2.util.DateTimeFunctions;
import org.h2.util.MathUtils;
import org.h2.util.New;
import org.h2.util.ParserUtil;
......@@ -793,8 +794,7 @@ public class Parser {
do {
Column column = readTableColumn(filter);
columns.add(column);
} while (readIf(","));
read(")");
} while (readIfMore(true));
read("=");
Expression expression = readExpression();
if (columns.size() == 1) {
......@@ -905,8 +905,7 @@ public class Parser {
column.sortType |= SortOrder.NULLS_LAST;
}
}
} while (readIf(","));
read(")");
} while (readIfMore(true));
return columns.toArray(new IndexColumn[0]);
}
......@@ -915,7 +914,7 @@ public class Parser {
do {
String columnName = readColumnIdentifier();
columns.add(columnName);
} while (readIfMore());
} while (readIfMore(false));
return columns.toArray(new String[0]);
}
......@@ -930,7 +929,7 @@ public class Parser {
column.getSQL());
}
columns.add(column);
} while (readIfMore());
} while (readIfMore(false));
}
return columns.toArray(new Column[0]);
}
......@@ -943,9 +942,16 @@ public class Parser {
return table.getColumn(id);
}
private boolean readIfMore() {
/**
* Read comma or closing parenthesis.
*
* @param strict
* if {@code false} additional comma before closing parenthesis is allowed
* @return {@code true} if comma is read, {@code false} if closing parenthesis is read
*/
private boolean readIfMore(boolean strict) {
if (readIf(",")) {
return !readIf(")");
return strict || !readIf(")");
}
read(")");
return false;
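The `readIfMore(strict)` contract above can be tried out in isolation. The following is a minimal standalone sketch (hypothetical class and token-stream names, not H2 code): after each list element a comma means another element follows and `)` ends the list, and with `strict == false` a trailing comma directly before the closing parenthesis is also accepted.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Iterator;
import java.util.List;

// Standalone sketch of the readIfMore(strict) contract from the diff above.
public class ListParserSketch {
    private final Iterator<String> tokens;
    private String current;

    ListParserSketch(List<String> tokens) {
        this.tokens = tokens.iterator();
        advance();
    }

    private void advance() {
        current = tokens.hasNext() ? tokens.next() : null;
    }

    private boolean readIf(String expected) {
        if (expected.equals(current)) {
            advance();
            return true;
        }
        return false;
    }

    private void read(String expected) {
        if (!readIf(expected)) {
            throw new IllegalStateException("expected " + expected + ", got " + current);
        }
    }

    // Mirrors Parser.readIfMore(boolean strict): in strict mode the ')' after
    // a comma is not consumed, so a trailing comma later fails with an error.
    private boolean readIfMore(boolean strict) {
        if (readIf(",")) {
            return strict || !readIf(")");
        }
        read(")");
        return false;
    }

    static List<String> parse(String[] tokens, boolean strict) {
        ListParserSketch p = new ListParserSketch(Arrays.asList(tokens));
        List<String> items = new ArrayList<>();
        do {
            items.add(p.current);
            p.advance();
        } while (p.readIfMore(strict));
        return items;
    }

    public static void main(String[] args) {
        // Lenient mode tolerates the trailing comma before ')':
        System.out.println(parse(new String[] {"A", ",", "B", ",", ")"}, false));
        // Without a trailing comma both modes agree:
        System.out.println(parse(new String[] {"A", ",", "B", ")"}, true));
    }
}
```

This shows why the parser can pass `true` at call sites that previously used `readIf(",")` followed by `read(")")`: strict mode never swallows the closing parenthesis after a comma.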
......@@ -1109,7 +1115,7 @@ public class Parser {
} else {
values.add(readExpression());
}
} while (readIfMore());
} while (readIfMore(false));
}
command.addRow(values.toArray(new Expression[0]));
} while (readIf(","));
......@@ -1280,7 +1286,7 @@ public class Parser {
} else {
values.add(readExpression());
}
} while (readIfMore());
} while (readIfMore(false));
}
command.addRow(values.toArray(new Expression[0]));
// the following condition will allow (..),; and (..);
......@@ -1339,7 +1345,7 @@ public class Parser {
} else {
values.add(readExpression());
}
} while (readIfMore());
} while (readIfMore(false));
}
command.addRow(values.toArray(new Expression[0]));
} while (readIf(","));
......@@ -1475,8 +1481,7 @@ public class Parser {
String indexName = readIdentifierWithSchema();
Index index = table.getIndex(indexName);
indexNames.add(index.getName());
} while (readIf(","));
read(")");
} while (readIfMore(true));
}
return IndexHints.createUseIndexHints(indexNames);
}
......@@ -1502,8 +1507,7 @@ public class Parser {
ArrayList<String> derivedColumnNames = New.arrayList();
do {
derivedColumnNames.add(readAliasIdentifier());
} while (readIf(","));
read(")");
} while (readIfMore(true));
return derivedColumnNames;
}
return null;
......@@ -2721,8 +2725,7 @@ public class Parser {
ArrayList<Expression> params = New.arrayList();
do {
params.add(readExpression());
} while (readIf(","));
read(")");
} while (readIfMore(true));
Expression filterCondition;
if (readIf("FILTER")) {
read("(");
......@@ -2798,7 +2801,7 @@ public class Parser {
}
case Function.DATE_ADD:
case Function.DATE_DIFF: {
if (Function.isDatePart(currentToken)) {
if (DateTimeFunctions.isDatePart(currentToken)) {
function.setParameter(0,
ValueExpression.get(ValueString.get(currentToken)));
read();
......@@ -2893,8 +2896,7 @@ public class Parser {
read("=");
function.setParameter(i, readExpression());
i++;
} while (readIf(","));
read(")");
} while (readIfMore(true));
TableFunction tf = (TableFunction) function;
tf.setColumns(columns);
break;
......@@ -2914,8 +2916,7 @@ public class Parser {
int i = 0;
do {
function.setParameter(i++, readExpression());
} while (readIf(","));
read(")");
} while (readIfMore(true));
}
}
function.doneWithParameters();
......@@ -3541,19 +3542,6 @@ public class Parser {
return false;
}
/*
* Reads passed tokens in list, in order, and returns true on the first match.
* If none of the tokens matches, returns false.
*/
private boolean readIfOr(String... tokens) {
for (String token: tokens) {
if (readIf(token)) {
return true;
}
}
return false;
}
/*
* Reads every token in list, in order - returns true if all are found.
* If any are not found, returns false - AND resets parsing back to state when called.
......@@ -4153,40 +4141,39 @@ public class Parser {
break;
}
} else if (s.length() == 2) {
char c1 = s.charAt(1);
switch (c0) {
case ':':
if ("::".equals(s)) {
return KEYWORD;
} else if (":=".equals(s)) {
if (c1 == ':' || c1 == '=') {
return KEYWORD;
}
break;
case '>':
if (">=".equals(s)) {
if (c1 == '=') {
return BIGGER_EQUAL;
}
break;
case '<':
if ("<=".equals(s)) {
if (c1 == '=') {
return SMALLER_EQUAL;
} else if ("<>".equals(s)) {
} else if (c1 == '>') {
return NOT_EQUAL;
}
break;
case '!':
if ("!=".equals(s)) {
if (c1 == '=') {
return NOT_EQUAL;
} else if ("!~".equals(s)) {
} else if (c1 == '~') {
return KEYWORD;
}
break;
case '|':
if ("||".equals(s)) {
if (c1 == '|') {
return STRING_CONCAT;
}
break;
case '&':
if ("&&".equals(s)) {
if (c1 == '&') {
return SPATIAL_INTERSECTS;
}
break;
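The refactoring in this hunk replaces whole-string comparisons such as `"<=".equals(s)` with a single `charAt(1)` test once the first character has selected a `switch` case. A compact sketch of the resulting classification (the token-type constants here are hypothetical stand-ins for the Parser's own constants):

```java
// Sketch of two-character operator classification after the refactor:
// the first character picks the switch case, only the second is compared.
public class TokenSketch {
    static final int KEYWORD = 0, BIGGER_EQUAL = 1, SMALLER_EQUAL = 2,
            NOT_EQUAL = 3, STRING_CONCAT = 4, SPATIAL_INTERSECTS = 5, OTHER = -1;

    static int classify(String s) {
        if (s.length() != 2) {
            return OTHER;
        }
        char c0 = s.charAt(0), c1 = s.charAt(1);
        switch (c0) {
        case ':':
            return (c1 == ':' || c1 == '=') ? KEYWORD : OTHER;          // "::" and ":="
        case '>':
            return c1 == '=' ? BIGGER_EQUAL : OTHER;                     // ">="
        case '<':
            return c1 == '=' ? SMALLER_EQUAL
                    : c1 == '>' ? NOT_EQUAL : OTHER;                     // "<=", "<>"
        case '!':
            return c1 == '=' ? NOT_EQUAL
                    : c1 == '~' ? KEYWORD : OTHER;                       // "!=", "!~"
        case '|':
            return c1 == '|' ? STRING_CONCAT : OTHER;                    // "||"
        case '&':
            return c1 == '&' ? SPATIAL_INTERSECTS : OTHER;               // "&&"
        default:
            return OTHER;
        }
    }

    public static void main(String[] args) {
        System.out.println(classify("<>") == NOT_EQUAL);
        System.out.println(classify("||") == STRING_CONCAT);
    }
}
```

The behavior is unchanged; the rewrite just avoids re-reading the first character inside each `equals` call.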
......@@ -4490,7 +4477,9 @@ public class Parser {
}
original += "(" + p;
// Oracle syntax
readIfOr("CHAR", "BYTE");
if (!readIf("CHAR")) {
readIf("BYTE");
}
if (dataType.supportsScale) {
if (readIf(",")) {
scale = readInt();
......@@ -4524,13 +4513,12 @@ public class Parser {
String enumerator0 = readString();
enumeratorList.add(enumerator0);
original += "'" + enumerator0 + "'";
while (readIf(",")) {
while (readIfMore(true)) {
original += ',';
String enumeratorN = readString();
original += "'" + enumeratorN + "'";
enumeratorList.add(enumeratorN);
}
read(")");
original += ')';
enumerators = enumeratorList.toArray(new String[0]);
}
......@@ -4850,10 +4838,7 @@ public class Parser {
columns.set(i, column);
row.add(expr);
i++;
} while (multiColumn && readIf(","));
if (multiColumn) {
read(")");
}
} while (multiColumn && readIfMore(true));
rows.add(row);
} while (readIf(","));
int columnCount = columns.size();
......@@ -6357,8 +6342,7 @@ public class Parser {
command.setIfNotExists(false);
do {
parseTableColumnDefinition(command, schema, tableName);
} while (readIf(","));
read(")");
} while (readIfMore(true));
} else {
boolean ifNotExists = readIfNotExists();
command.setIfNotExists(ifNotExists);
......@@ -6609,7 +6593,7 @@ public class Parser {
if (!readIf(")")) {
do {
parseTableColumnDefinition(command, schema, tableName);
} while (readIfMore());
} while (readIfMore(false));
}
}
// Allows "COMMENT='comment'" in DDL statements (MySQL syntax)
......
......@@ -84,6 +84,9 @@ public class AlterTableAlterColumn extends CommandWithColumns {
this.oldColumn = oldColumn;
}
/**
* Add the column as the first column of the table.
*/
public void setAddFirst() {
addFirst = true;
}
......
......@@ -60,6 +60,12 @@ public abstract class CommandWithColumns extends SchemaCommand {
}
}
/**
* For the given list of columns, disable "nullable" for those columns that
* are primary key columns.
*
* @param columns the list of columns
*/
protected void changePrimaryKeysToNotNull(ArrayList<Column> columns) {
if (pkColumns != null) {
for (Column c : columns) {
......@@ -72,6 +78,9 @@ public abstract class CommandWithColumns extends SchemaCommand {
}
}
/**
* Create the constraints.
*/
protected void createConstraints() {
if (constraintCommands != null) {
for (DefineCommand command : constraintCommands) {
......@@ -81,6 +90,15 @@ public abstract class CommandWithColumns extends SchemaCommand {
}
}
/**
* For the given list of columns, create sequences for auto-increment
* columns (if needed), and then get the list of all sequences of the
* columns.
*
* @param columns the columns
* @param temporary whether generated sequences should be temporary
* @return the list of sequences (may be empty)
*/
protected ArrayList<Sequence> generateSequences(ArrayList<Column> columns, boolean temporary) {
ArrayList<Sequence> sequences = New.arrayList();
if (columns != null) {
......
......@@ -10,16 +10,12 @@ import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.Reader;
import java.math.BigDecimal;
import java.nio.charset.StandardCharsets;
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.text.DateFormatSymbols;
import java.util.ArrayList;
import java.util.GregorianCalendar;
import java.util.HashMap;
import java.util.Locale;
import java.util.regex.Pattern;
import java.util.regex.PatternSyntaxException;
......@@ -44,6 +40,7 @@ import org.h2.table.Table;
import org.h2.table.TableFilter;
import org.h2.tools.CompressTool;
import org.h2.tools.Csv;
import org.h2.util.DateTimeFunctions;
import org.h2.util.DateTimeUtils;
import org.h2.util.IOUtils;
import org.h2.util.JdbcUtils;
......@@ -60,7 +57,6 @@ import org.h2.value.ValueArray;
import org.h2.value.ValueBoolean;
import org.h2.value.ValueBytes;
import org.h2.value.ValueDate;
import org.h2.value.ValueDecimal;
import org.h2.value.ValueDouble;
import org.h2.value.ValueInt;
import org.h2.value.ValueLong;
......@@ -150,14 +146,8 @@ public class Function extends Expression implements FunctionCall {
private static final long PRECISION_UNKNOWN = -1;
private static final HashMap<String, FunctionInfo> FUNCTIONS = new HashMap<>();
private static final HashMap<String, Integer> DATE_PART = new HashMap<>();
private static final char[] SOUNDEX_INDEX = new char[128];
/**
* English names of months and week days.
*/
private static volatile String[][] MONTHS_AND_WEEKS;
protected Expression[] args;
private final FunctionInfo info;
......@@ -168,54 +158,6 @@ public class Function extends Expression implements FunctionCall {
private final Database database;
static {
// DATE_PART
DATE_PART.put("SQL_TSI_YEAR", YEAR);
DATE_PART.put("YEAR", YEAR);
DATE_PART.put("YYYY", YEAR);
DATE_PART.put("YY", YEAR);
DATE_PART.put("SQL_TSI_MONTH", MONTH);
DATE_PART.put("MONTH", MONTH);
DATE_PART.put("MM", MONTH);
DATE_PART.put("M", MONTH);
DATE_PART.put("QUARTER", QUARTER);
DATE_PART.put("SQL_TSI_WEEK", WEEK);
DATE_PART.put("WW", WEEK);
DATE_PART.put("WK", WEEK);
DATE_PART.put("WEEK", WEEK);
DATE_PART.put("ISO_WEEK", ISO_WEEK);
DATE_PART.put("DAY", DAY_OF_MONTH);
DATE_PART.put("DD", DAY_OF_MONTH);
DATE_PART.put("D", DAY_OF_MONTH);
DATE_PART.put("SQL_TSI_DAY", DAY_OF_MONTH);
DATE_PART.put("DAY_OF_WEEK", DAY_OF_WEEK);
DATE_PART.put("DAYOFWEEK", DAY_OF_WEEK);
DATE_PART.put("DOW", DAY_OF_WEEK);
DATE_PART.put("ISO_DAY_OF_WEEK", ISO_DAY_OF_WEEK);
DATE_PART.put("DAYOFYEAR", DAY_OF_YEAR);
DATE_PART.put("DAY_OF_YEAR", DAY_OF_YEAR);
DATE_PART.put("DY", DAY_OF_YEAR);
DATE_PART.put("DOY", DAY_OF_YEAR);
DATE_PART.put("SQL_TSI_HOUR", HOUR);
DATE_PART.put("HOUR", HOUR);
DATE_PART.put("HH", HOUR);
DATE_PART.put("SQL_TSI_MINUTE", MINUTE);
DATE_PART.put("MINUTE", MINUTE);
DATE_PART.put("MI", MINUTE);
DATE_PART.put("N", MINUTE);
DATE_PART.put("SQL_TSI_SECOND", SECOND);
DATE_PART.put("SECOND", SECOND);
DATE_PART.put("SS", SECOND);
DATE_PART.put("S", SECOND);
DATE_PART.put("MILLISECOND", MILLISECOND);
DATE_PART.put("MS", MILLISECOND);
DATE_PART.put("EPOCH", EPOCH);
DATE_PART.put("MICROSECOND", MICROSECOND);
DATE_PART.put("MCS", MICROSECOND);
DATE_PART.put("NANOSECOND", NANOSECOND);
DATE_PART.put("NS", NANOSECOND);
DATE_PART.put("TIMEZONE_HOUR", TIMEZONE_HOUR);
DATE_PART.put("TIMEZONE_MINUTE", TIMEZONE_MINUTE);
// SOUNDEX_INDEX
String index = "7AEIOUY8HW1BFPV2CGJKQSXZ3DT4L5MN6R";
char number = 0;
......@@ -853,7 +795,7 @@ public class Function extends Expression implements FunctionCall {
break;
case DAY_NAME: {
int dayOfWeek = DateTimeUtils.getSundayDayOfWeek(DateTimeUtils.dateAndTimeFromValue(v0)[0]);
result = ValueString.get(getMonthsAndWeeks(1)[dayOfWeek],
result = ValueString.get(DateTimeFunctions.getMonthsAndWeeks(1)[dayOfWeek],
database.getMode().treatEmptyStringsAsNull);
break;
}
......@@ -870,11 +812,11 @@ public class Function extends Expression implements FunctionCall {
case SECOND:
case WEEK:
case YEAR:
result = ValueInt.get(getIntDatePart(v0, info.type));
result = ValueInt.get(DateTimeFunctions.getIntDatePart(v0, info.type));
break;
case MONTH_NAME: {
int month = DateTimeUtils.monthFromDateValue(DateTimeUtils.dateAndTimeFromValue(v0)[0]);
result = ValueString.get(getMonthsAndWeeks(0)[month - 1],
result = ValueString.get(DateTimeFunctions.getMonthsAndWeeks(0)[month - 1],
database.getMode().treatEmptyStringsAsNull);
break;
}
......@@ -1470,7 +1412,7 @@ public class Function extends Expression implements FunctionCall {
v1 == null ? null : v1.getString());
break;
case ADD_MONTHS:
result = dateadd("MONTH", v1.getInt(), v0);
result = DateTimeFunctions.dateadd("MONTH", v1.getInt(), v0);
break;
case TO_TIMESTAMP_TZ:
result = ToDateParser.toTimestampTz(v0.getString(),
......@@ -1489,10 +1431,10 @@ public class Function extends Expression implements FunctionCall {
database.getMode().treatEmptyStringsAsNull);
break;
case DATE_ADD:
result = dateadd(v0.getString(), v1.getLong(), v2);
result = DateTimeFunctions.dateadd(v0.getString(), v1.getLong(), v2);
break;
case DATE_DIFF:
result = ValueLong.get(datediff(v0.getString(), v1, v2));
result = ValueLong.get(DateTimeFunctions.datediff(v0.getString(), v1, v2));
break;
case DATE_TRUNC:
// Retrieve the time unit (e.g. 'day', 'microseconds', etc.)
......@@ -1501,60 +1443,9 @@ public class Function extends Expression implements FunctionCall {
result = DateTimeUtils.truncateDate(timeUnit, v1);
break;
case EXTRACT: {
int field = getDatePart(v0.getString());
if (field != EPOCH) {
result = ValueInt.get(getIntDatePart(v1, field));
} else {
// Case where we retrieve the EPOCH time.
// First we retrieve the dateValue and his time in nanoseconds.
long[] a = DateTimeUtils.dateAndTimeFromValue(v1);
long dateValue = a[0];
long timeNanos = a[1];
// We compute the time in nanoseconds and the total number of days.
BigDecimal timeNanosBigDecimal = new BigDecimal(timeNanos);
BigDecimal numberOfDays = new BigDecimal(DateTimeUtils.absoluteDayFromDateValue(dateValue));
BigDecimal nanosSeconds = new BigDecimal(1_000_000_000);
BigDecimal secondsPerDay = new BigDecimal(DateTimeUtils.SECONDS_PER_DAY);
// Case where the value is of type time e.g. '10:00:00'
if (v1 instanceof ValueTime) {
// In order to retrieve the EPOCH time we only have to convert the time
// in nanoseconds (previously retrieved) in seconds.
result = ValueDecimal.get(timeNanosBigDecimal.divide(nanosSeconds));
} else if (v1 instanceof ValueDate) {
// Case where the value is of type date '2000:01:01', we have to retrieve the
// total number of days and multiply it by the number of seconds in a day.
result = ValueDecimal.get(numberOfDays.multiply(secondsPerDay));
} else if (v1 instanceof ValueTimestampTimeZone) {
// Case where the value is a of type ValueTimestampTimeZone
// ('2000:01:01 10:00:00+05').
// We retrieve the time zone offset in minutes
ValueTimestampTimeZone v = (ValueTimestampTimeZone) v1;
BigDecimal timeZoneOffsetSeconds = new BigDecimal(v.getTimeZoneOffsetMins() * 60);
// Sum the time in nanoseconds and the total number of days in seconds
// and adding the timeZone offset in seconds.
result = ValueDecimal.get(timeNanosBigDecimal.divide(nanosSeconds)
.add(numberOfDays.multiply(secondsPerDay)).subtract(timeZoneOffsetSeconds));
} else {
// By default, we have the date and the time ('2000:01:01 10:00:00') if no type
// is given.
// We just have to sum the time in nanoseconds and the total number of days in
// seconds.
result = ValueDecimal
.get(timeNanosBigDecimal.divide(nanosSeconds).add(numberOfDays.multiply(secondsPerDay)));
}
}
case EXTRACT:
result = DateTimeFunctions.extract(v0.getString(), v1);
break;
}
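The EPOCH arithmetic that this hunk moves into DateTimeFunctions.extract() boils down to: epoch seconds = absolute days × 86400 + in-day nanoseconds / 10^9, minus the zone offset in seconds when the value carries one. A standalone sketch of just that arithmetic (not the H2 API; `absoluteDays` is assumed to count days since 1970-01-01, as H2's absoluteDayFromDateValue() does):

```java
import java.math.BigDecimal;

// Standalone sketch of the EPOCH computation described in the removed branch.
public class EpochSketch {
    static final BigDecimal NANOS_PER_SECOND = BigDecimal.valueOf(1_000_000_000L);
    static final BigDecimal SECONDS_PER_DAY = BigDecimal.valueOf(86_400L);

    static BigDecimal epochSeconds(long absoluteDays, long timeNanos, int tzOffsetMins) {
        return BigDecimal.valueOf(absoluteDays).multiply(SECONDS_PER_DAY)
                // in-day time converted from nanoseconds to seconds
                .add(BigDecimal.valueOf(timeNanos).divide(NANOS_PER_SECOND))
                // a positive zone offset means local time is ahead of UTC
                .subtract(BigDecimal.valueOf(tzOffsetMins * 60L));
    }

    public static void main(String[] args) {
        // 1970-01-02 00:00:10 UTC is 86400 + 10 seconds after the epoch.
        System.out.println(epochSeconds(1, 10_000_000_000L, 0));
    }
}
```

For a TIME value only the nanoseconds term applies, for a DATE only the day term, which is exactly the case split the removed code performed.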
case FORMATDATETIME: {
if (v0 == ValueNull.INSTANCE || v1 == ValueNull.INSTANCE) {
result = ValueNull.INSTANCE;
......@@ -1567,7 +1458,7 @@ public class Function extends Expression implements FunctionCall {
tz = DateTimeUtils.timeZoneNameFromOffsetMins(
((ValueTimestampTimeZone) v0).getTimeZoneOffsetMins());
}
result = ValueString.get(DateTimeUtils.formatDateTime(
result = ValueString.get(DateTimeFunctions.formatDateTime(
v0.getTimestamp(), v1.getString(), locale, tz),
database.getMode().treatEmptyStringsAsNull);
}
......@@ -1581,7 +1472,7 @@ public class Function extends Expression implements FunctionCall {
null : v2 == ValueNull.INSTANCE ? null : v2.getString();
String tz = v3 == null ?
null : v3 == ValueNull.INSTANCE ? null : v3.getString();
java.util.Date d = DateTimeUtils.parseDateTime(
java.util.Date d = DateTimeFunctions.parseDateTime(
v0.getString(), v1.getString(), locale, tz);
result = ValueTimestamp.fromMillis(d.getTime());
}
......@@ -1849,244 +1740,6 @@ public class Function extends Expression implements FunctionCall {
return bytes;
}
/**
* Check if a given string is a valid date part string.
*
* @param part the string
* @return true if it is
*/
public static boolean isDatePart(String part) {
Integer p = DATE_PART.get(StringUtils.toUpperEnglish(part));
return p != null;
}
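The DATE_PART table that this diff removes (it moved to DateTimeFunctions) maps many spellings of one date part to a single field id, and isDatePart() is just a case-insensitive lookup in it. A sketch with a small subset of the aliases listed above (the field ids are hypothetical stand-ins):

```java
import java.util.HashMap;
import java.util.Locale;

// Sketch of the DATE_PART alias table with a subset of the aliases from the diff.
public class DatePartSketch {
    static final int YEAR = 0, MONTH = 1, SECOND = 2;
    static final HashMap<String, Integer> DATE_PART = new HashMap<>();
    static {
        // Several spellings resolve to the same field id.
        DATE_PART.put("SQL_TSI_YEAR", YEAR);
        DATE_PART.put("YEAR", YEAR);
        DATE_PART.put("YYYY", YEAR);
        DATE_PART.put("YY", YEAR);
        DATE_PART.put("MONTH", MONTH);
        DATE_PART.put("MM", MONTH);
        DATE_PART.put("SECOND", SECOND);
        DATE_PART.put("SS", SECOND);
    }

    static boolean isDatePart(String part) {
        // The real code uses StringUtils.toUpperEnglish for the same effect.
        return DATE_PART.containsKey(part.toUpperCase(Locale.ENGLISH));
    }

    public static void main(String[] args) {
        System.out.println(isDatePart("yyyy"));
        System.out.println(isDatePart("fortnight"));
    }
}
```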
private static int getDatePart(String part) {
Integer p = DATE_PART.get(StringUtils.toUpperEnglish(part));
if (p == null) {
throw DbException.getInvalidValueException("date part", part);
}
return p.intValue();
}
private static Value dateadd(String part, long count, Value v) {
int field = getDatePart(part);
if (field != MILLISECOND && field != MICROSECOND && field != NANOSECOND &&
(count > Integer.MAX_VALUE || count < Integer.MIN_VALUE)) {
throw DbException.getInvalidValueException("DATEADD count", count);
}
boolean withDate = !(v instanceof ValueTime);
boolean withTime = !(v instanceof ValueDate);
boolean forceTimestamp = false;
long[] a = DateTimeUtils.dateAndTimeFromValue(v);
long dateValue = a[0];
long timeNanos = a[1];
switch (field) {
case QUARTER:
count *= 3;
//$FALL-THROUGH$
case YEAR:
case MONTH: {
if (!withDate) {
throw DbException.getInvalidValueException("DATEADD time part", part);
}
long year = DateTimeUtils.yearFromDateValue(dateValue);
long month = DateTimeUtils.monthFromDateValue(dateValue);
int day = DateTimeUtils.dayFromDateValue(dateValue);
if (field == YEAR) {
year += count;
} else {
month += count;
}
dateValue = DateTimeUtils.dateValueFromDenormalizedDate(year, month, day);
return DateTimeUtils.dateTimeToValue(v, dateValue, timeNanos, forceTimestamp);
}
case WEEK:
case ISO_WEEK:
count *= 7;
//$FALL-THROUGH$
case DAY_OF_WEEK:
case ISO_DAY_OF_WEEK:
case DAY_OF_MONTH:
case DAY_OF_YEAR:
if (!withDate) {
throw DbException.getInvalidValueException("DATEADD time part", part);
}
dateValue = DateTimeUtils.dateValueFromAbsoluteDay(
DateTimeUtils.absoluteDayFromDateValue(dateValue) + count);
return DateTimeUtils.dateTimeToValue(v, dateValue, timeNanos, forceTimestamp);
case HOUR:
count *= 3_600_000_000_000L;
break;
case MINUTE:
count *= 60_000_000_000L;
break;
case SECOND:
case EPOCH:
count *= 1_000_000_000;
break;
case MILLISECOND:
count *= 1_000_000;
break;
case MICROSECOND:
count *= 1_000;
break;
case NANOSECOND:
break;
case TIMEZONE_HOUR:
count *= 60;
//$FALL-THROUGH$
case TIMEZONE_MINUTE: {
if (!(v instanceof ValueTimestampTimeZone)) {
throw DbException.getUnsupportedException("DATEADD " + part);
}
count += ((ValueTimestampTimeZone) v).getTimeZoneOffsetMins();
return ValueTimestampTimeZone.fromDateValueAndNanos(dateValue, timeNanos, (short) count);
}
default:
throw DbException.getUnsupportedException("DATEADD " + part);
}
if (!withTime) {
// Treat date as timestamp at the start of this date
forceTimestamp = true;
}
timeNanos += count;
if (timeNanos >= DateTimeUtils.NANOS_PER_DAY || timeNanos < 0) {
long d;
if (timeNanos >= DateTimeUtils.NANOS_PER_DAY) {
d = timeNanos / DateTimeUtils.NANOS_PER_DAY;
} else {
d = (timeNanos - DateTimeUtils.NANOS_PER_DAY + 1) / DateTimeUtils.NANOS_PER_DAY;
}
timeNanos -= d * DateTimeUtils.NANOS_PER_DAY;
return DateTimeUtils.dateTimeToValue(v,
DateTimeUtils.dateValueFromAbsoluteDay(DateTimeUtils.absoluteDayFromDateValue(dateValue) + d),
timeNanos, forceTimestamp);
}
return DateTimeUtils.dateTimeToValue(v, dateValue, timeNanos, forceTimestamp);
}
/**
* Calculate the number of crossed unit boundaries between two timestamps.
* This method is supported for MS SQL Server compatibility.
* <pre>
* DATEDIFF(YEAR, '2004-12-31', '2005-01-01') = 1
* </pre>
*
* @param part the part
* @param v1 the first date-time value
* @param v2 the second date-time value
* @return the number of crossed boundaries
*/
private static long datediff(String part, Value v1, Value v2) {
int field = getDatePart(part);
long[] a1 = DateTimeUtils.dateAndTimeFromValue(v1);
long dateValue1 = a1[0];
long absolute1 = DateTimeUtils.absoluteDayFromDateValue(dateValue1);
long[] a2 = DateTimeUtils.dateAndTimeFromValue(v2);
long dateValue2 = a2[0];
long absolute2 = DateTimeUtils.absoluteDayFromDateValue(dateValue2);
switch (field) {
case NANOSECOND:
case MICROSECOND:
case MILLISECOND:
case SECOND:
case EPOCH:
case MINUTE:
case HOUR:
long timeNanos1 = a1[1];
long timeNanos2 = a2[1];
switch (field) {
case NANOSECOND:
return (absolute2 - absolute1) * DateTimeUtils.NANOS_PER_DAY
+ (timeNanos2 - timeNanos1);
case MICROSECOND:
return (absolute2 - absolute1) * (DateTimeUtils.MILLIS_PER_DAY * 1_000)
+ (timeNanos2 / 1_000 - timeNanos1 / 1_000);
case MILLISECOND:
return (absolute2 - absolute1) * DateTimeUtils.MILLIS_PER_DAY
+ (timeNanos2 / 1_000_000 - timeNanos1 / 1_000_000);
case SECOND:
case EPOCH:
return (absolute2 - absolute1) * 86_400
+ (timeNanos2 / 1_000_000_000 - timeNanos1 / 1_000_000_000);
case MINUTE:
return (absolute2 - absolute1) * 1_440
+ (timeNanos2 / 60_000_000_000L - timeNanos1 / 60_000_000_000L);
case HOUR:
return (absolute2 - absolute1) * 24
+ (timeNanos2 / 3_600_000_000_000L - timeNanos1 / 3_600_000_000_000L);
}
// Fake fall-through
//$FALL-THROUGH$
case DAY_OF_MONTH:
case DAY_OF_YEAR:
case DAY_OF_WEEK:
case ISO_DAY_OF_WEEK:
return absolute2 - absolute1;
case WEEK:
return weekdiff(absolute1, absolute2, 0);
case ISO_WEEK:
return weekdiff(absolute1, absolute2, 1);
case MONTH:
return (DateTimeUtils.yearFromDateValue(dateValue2) - DateTimeUtils.yearFromDateValue(dateValue1)) * 12
+ DateTimeUtils.monthFromDateValue(dateValue2) - DateTimeUtils.monthFromDateValue(dateValue1);
case QUARTER:
return (DateTimeUtils.yearFromDateValue(dateValue2) - DateTimeUtils.yearFromDateValue(dateValue1)) * 4
+ (DateTimeUtils.monthFromDateValue(dateValue2) - 1) / 3
- (DateTimeUtils.monthFromDateValue(dateValue1) - 1) / 3;
case YEAR:
return DateTimeUtils.yearFromDateValue(dateValue2) - DateTimeUtils.yearFromDateValue(dateValue1);
case TIMEZONE_HOUR:
case TIMEZONE_MINUTE: {
int offsetMinutes1;
if (v1 instanceof ValueTimestampTimeZone) {
offsetMinutes1 = ((ValueTimestampTimeZone) v1).getTimeZoneOffsetMins();
} else {
offsetMinutes1 = DateTimeUtils.getTimeZoneOffsetMillis(null, dateValue1, a1[1]);
}
int offsetMinutes2;
if (v2 instanceof ValueTimestampTimeZone) {
offsetMinutes2 = ((ValueTimestampTimeZone) v2).getTimeZoneOffsetMins();
} else {
offsetMinutes2 = DateTimeUtils.getTimeZoneOffsetMillis(null, dateValue2, a2[1]);
}
if (field == TIMEZONE_HOUR) {
return (offsetMinutes2 / 60) - (offsetMinutes1 / 60);
} else {
return offsetMinutes2 - offsetMinutes1;
}
}
default:
throw DbException.getUnsupportedException("DATEDIFF " + part);
}
}
private static long weekdiff(long absolute1, long absolute2, int firstDayOfWeek) {
absolute1 += 4 - firstDayOfWeek;
long r1 = absolute1 / 7;
if (absolute1 < 0 && (r1 * 7 != absolute1)) {
r1--;
}
absolute2 += 4 - firstDayOfWeek;
long r2 = absolute2 / 7;
if (absolute2 < 0 && (r2 * 7 != absolute2)) {
r2--;
}
return r2 - r1;
}
private static String[] getMonthsAndWeeks(int field) {
String[][] result = MONTHS_AND_WEEKS;
if (result == null) {
result = new String[2][];
DateFormatSymbols dfs = DateFormatSymbols.getInstance(Locale.ENGLISH);
result[0] = dfs.getMonths();
result[1] = dfs.getWeekdays();
MONTHS_AND_WEEKS = result;
}
return result[field];
}
private static String substring(String s, int start, int length) {
int len = s.length();
start--;
......@@ -2920,69 +2573,6 @@ public class Function extends Expression implements FunctionCall {
}
}
/**
* Get the specified field of a date, however with years normalized to
* positive or negative, and month starting with 1.
*
* @param date the date value
* @param field the field type, see {@link Function} for constants
* @return the value
*/
public static int getIntDatePart(Value date, int field) {
long[] a = DateTimeUtils.dateAndTimeFromValue(date);
long dateValue = a[0];
long timeNanos = a[1];
switch (field) {
case YEAR:
return DateTimeUtils.yearFromDateValue(dateValue);
case MONTH:
return DateTimeUtils.monthFromDateValue(dateValue);
case DAY_OF_MONTH:
return DateTimeUtils.dayFromDateValue(dateValue);
case HOUR:
return (int) (timeNanos / 3_600_000_000_000L % 24);
case MINUTE:
return (int) (timeNanos / 60_000_000_000L % 60);
case SECOND:
return (int) (timeNanos / 1_000_000_000 % 60);
case MILLISECOND:
return (int) (timeNanos / 1_000_000 % 1_000);
case MICROSECOND:
return (int) (timeNanos / 1_000 % 1_000_000);
case NANOSECOND:
return (int) (timeNanos % 1_000_000_000);
case DAY_OF_YEAR:
return DateTimeUtils.getDayOfYear(dateValue);
case DAY_OF_WEEK:
return DateTimeUtils.getSundayDayOfWeek(dateValue);
case WEEK:
GregorianCalendar gc = DateTimeUtils.getCalendar();
return DateTimeUtils.getWeekOfYear(dateValue, gc.getFirstDayOfWeek() - 1, gc.getMinimalDaysInFirstWeek());
case QUARTER:
return (DateTimeUtils.monthFromDateValue(dateValue) - 1) / 3 + 1;
case ISO_YEAR:
return DateTimeUtils.getIsoWeekYear(dateValue);
case ISO_WEEK:
return DateTimeUtils.getIsoWeekOfYear(dateValue);
case ISO_DAY_OF_WEEK:
return DateTimeUtils.getIsoDayOfWeek(dateValue);
case TIMEZONE_HOUR:
case TIMEZONE_MINUTE: {
int offsetMinutes;
if (date instanceof ValueTimestampTimeZone) {
offsetMinutes = ((ValueTimestampTimeZone) date).getTimeZoneOffsetMins();
} else {
offsetMinutes = DateTimeUtils.getTimeZoneOffsetMillis(null, dateValue, timeNanos);
}
if (field == TIMEZONE_HOUR) {
return offsetMinutes / 60;
}
return offsetMinutes % 60;
}
}
throw DbException.getUnsupportedException("getDatePart(" + date + ", " + field + ')');
}
@Override
public Expression[] getArgs() {
return args;
......
......@@ -12,7 +12,9 @@ import org.h2.result.SearchRow;
import org.h2.value.Value;
/**
* Abstract function cursor. This implementation filters the rows (it only
* returns entries that are greater than or equal to "first" and less than or
* equal to "last").
*/
abstract class AbstractFunctionCursor implements Cursor {
private final FunctionIndex index;
......@@ -85,6 +87,11 @@ abstract class AbstractFunctionCursor implements Cursor {
return false;
}
/**
* Skip to the next row if one is available. This method does not filter.
*
* @return true if another row is available
*/
abstract boolean nextImpl();
@Override
......
......@@ -1329,6 +1329,7 @@ public class JdbcPreparedStatement extends JdbcStatement implements
Value[] set = new Value[size];
for (int i = 0; i < size; i++) {
ParameterInterface param = parameters.get(i);
param.checkSet();
Value value = param.getParamValue();
set[i] = value;
}
......
......@@ -354,7 +354,7 @@ public final class MVStore {
int kb = DataUtils.getConfigParam(config, "autoCommitBufferSize", 1024);
// 19 KB memory is about 1 KB storage
autoCommitMemory = kb * 1024 * 19;
autoCompactFillRate = DataUtils.getConfigParam(config, "autoCompactFillRate", 40);
char[] encryptionKey = (char[]) config.get("encryptionKey");
try {
if (!fileStoreIsProvided) {
......@@ -1083,14 +1083,7 @@ public final class MVStore {
private void storeNow() {
assert Thread.holdsLock(this);
long time = getTimeSinceCreation();
freeUnusedIfNeeded(time);
int currentUnsavedPageCount = unsavedMemory;
long storeVersion = currentStoreVersion;
long version = ++currentVersion;
......@@ -1125,7 +1118,6 @@ public final class MVStore {
}
}
Chunk c = new Chunk(newChunkId);
c.pageCount = Integer.MAX_VALUE;
c.pageCountLive = Integer.MAX_VALUE;
c.maxLen = Long.MAX_VALUE;
......@@ -1280,6 +1272,21 @@ public final class MVStore {
lastStoredVersion = storeVersion;
}
/**
* Try to free unused chunks. This method doesn't directly write, but can
* change the metadata, and therefore cause a background write.
*/
private void freeUnusedIfNeeded(long time) {
int freeDelay = retentionTime / 5;
if (time >= lastFreeUnusedChunks + freeDelay) {
// set early in case it fails (out of memory or so)
lastFreeUnusedChunks = time;
freeUnusedChunks();
// set it here as well, to avoid calling it often if it was slow
lastFreeUnusedChunks = getTimeSinceCreation();
}
}
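freeUnusedIfNeeded is a time-based throttle: it runs freeUnusedChunks at most once per retentionTime / 5 milliseconds, and updates the timestamp before running so that a failing run is not immediately retried. The same pattern in isolation (a generic sketch with hypothetical names, not H2 code):

```java
public class Throttle {
    private final long delayMillis;
    private long lastRun = Long.MIN_VALUE / 2; // far in the past: first call always runs

    public Throttle(long delayMillis) {
        this.delayMillis = delayMillis;
    }

    /** Runs the task only if at least delayMillis elapsed since the last run. */
    public boolean runIfDue(long now, Runnable task) {
        if (now < lastRun + delayMillis) {
            return false;
        }
        lastRun = now; // set before running, so a failing task is not retried at once
        task.run();
        return true;
    }
}
```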
private synchronized void freeUnusedChunks() {
if (lastChunk != null && reuseSpace) {
Set<Integer> referenced = collectReferencedChunks();
......@@ -1574,11 +1581,11 @@ public final class MVStore {
*/
private long getFileLengthInUse() {
long result = fileStore.getFileLengthInUse();
assert result == measureFileLengthInUse() : result + " != " + measureFileLengthInUse();
return result;
}
private long measureFileLengthInUse() {
long size = 2;
for (Chunk c : chunks.values()) {
if (c.len != Integer.MAX_VALUE) {
......@@ -1842,6 +1849,39 @@ public final class MVStore {
}
}
/**
* Get the current fill rate (percentage of used space in the file). Unlike
* the overall file fill rate, this only accounts for chunk data: it measures
* how much of the chunk data is still live (referenced). Young chunks are
* considered fully live.
*
* @return the fill rate, in percent (100 is completely full)
*/
public int getCurrentFillRate() {
long maxLengthSum = 1;
long maxLengthLiveSum = 1;
long time = getTimeSinceCreation();
for (Chunk c : chunks.values()) {
maxLengthSum += c.maxLen;
if (c.time + retentionTime > time) {
// young chunks are not optimized: assume they are fully live,
// so that we don't try to optimize them until they get old
maxLengthLiveSum += c.maxLen;
} else {
maxLengthLiveSum += c.maxLenLive;
}
}
// the fill rate of all chunks combined
if (maxLengthSum <= 0) {
// avoid division by 0
maxLengthSum = 1;
}
int fillRate = (int) (100 * maxLengthLiveSum / maxLengthSum);
return fillRate;
}
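The fill-rate computation above reduces to an integer percentage of live chunk bytes. A minimal sketch of just that arithmetic (the class and method names here are illustrative, not H2 API):

```java
public class FillRateSketch {
    /** Integer percentage of live bytes; 100 means completely full. */
    public static int fillRate(long maxLengthSum, long maxLengthLiveSum) {
        if (maxLengthSum <= 0) {
            maxLengthSum = 1; // avoid division by zero
        }
        return (int) (100 * maxLengthLiveSum / maxLengthSum);
    }
}
```

Seeding both sums with 1, as the method above does, already prevents a zero divisor; the explicit guard is defensive.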
private ArrayList<Chunk> findOldChunks(int targetFillRate, int write) {
if (lastChunk == null) {
// nothing to do
......@@ -2543,10 +2583,8 @@ public final class MVStore {
fileOps = false;
}
// use a lower fill rate if there were any file operations
int targetFillRate = fileOps ? autoCompactFillRate / 3 : autoCompactFillRate;
compact(targetFillRate, autoCommitMemory);
autoCompactLastFileOpCount = fileStore.getWriteCount() + fileStore.getReadCount();
}
} catch (Throwable e) {
......@@ -2915,7 +2953,7 @@ public final class MVStore {
* this value, then chunks at the end of the file are moved. Compaction
* stops if the target fill rate is reached.
* <p>
* The default value is 40 (40%). The value 0 disables auto-compacting.
* <p>
*
* @param percent the target fill rate
......
......@@ -96,6 +96,7 @@ public class StreamStore {
* @param in the stream
* @return the id (potentially an empty array)
*/
@SuppressWarnings("resource")
public byte[] put(InputStream in) throws IOException {
ByteArrayOutputStream id = new ByteArrayOutputStream();
int level = 0;
......
/*
* Copyright 2004-2018 H2 Group. Multiple-Licensed under the MPL 2.0,
* and the EPL 1.0 (http://h2database.com/html/license.html).
* Initial Developer: H2 Group
*/
package org.h2.util;
import static org.h2.expression.Function.DAY_OF_MONTH;
import static org.h2.expression.Function.DAY_OF_WEEK;
import static org.h2.expression.Function.DAY_OF_YEAR;
import static org.h2.expression.Function.EPOCH;
import static org.h2.expression.Function.HOUR;
import static org.h2.expression.Function.ISO_DAY_OF_WEEK;
import static org.h2.expression.Function.ISO_WEEK;
import static org.h2.expression.Function.ISO_YEAR;
import static org.h2.expression.Function.MICROSECOND;
import static org.h2.expression.Function.MILLISECOND;
import static org.h2.expression.Function.MINUTE;
import static org.h2.expression.Function.MONTH;
import static org.h2.expression.Function.NANOSECOND;
import static org.h2.expression.Function.QUARTER;
import static org.h2.expression.Function.SECOND;
import static org.h2.expression.Function.TIMEZONE_HOUR;
import static org.h2.expression.Function.TIMEZONE_MINUTE;
import static org.h2.expression.Function.WEEK;
import static org.h2.expression.Function.YEAR;
import java.math.BigDecimal;
import java.text.DateFormatSymbols;
import java.text.SimpleDateFormat;
import java.util.GregorianCalendar;
import java.util.HashMap;
import java.util.Locale;
import java.util.TimeZone;
import org.h2.api.ErrorCode;
import org.h2.expression.Function;
import org.h2.message.DbException;
import org.h2.value.Value;
import org.h2.value.ValueDate;
import org.h2.value.ValueDecimal;
import org.h2.value.ValueInt;
import org.h2.value.ValueTime;
import org.h2.value.ValueTimestampTimeZone;
/**
* Date and time functions.
*/
public final class DateTimeFunctions {
private static final HashMap<String, Integer> DATE_PART = new HashMap<>();
/**
* English names of months and week days.
*/
private static volatile String[][] MONTHS_AND_WEEKS;
static {
// DATE_PART
DATE_PART.put("SQL_TSI_YEAR", YEAR);
DATE_PART.put("YEAR", YEAR);
DATE_PART.put("YYYY", YEAR);
DATE_PART.put("YY", YEAR);
DATE_PART.put("SQL_TSI_MONTH", MONTH);
DATE_PART.put("MONTH", MONTH);
DATE_PART.put("MM", MONTH);
DATE_PART.put("M", MONTH);
DATE_PART.put("QUARTER", QUARTER);
DATE_PART.put("SQL_TSI_WEEK", WEEK);
DATE_PART.put("WW", WEEK);
DATE_PART.put("WK", WEEK);
DATE_PART.put("WEEK", WEEK);
DATE_PART.put("ISO_WEEK", ISO_WEEK);
DATE_PART.put("DAY", DAY_OF_MONTH);
DATE_PART.put("DD", DAY_OF_MONTH);
DATE_PART.put("D", DAY_OF_MONTH);
DATE_PART.put("SQL_TSI_DAY", DAY_OF_MONTH);
DATE_PART.put("DAY_OF_WEEK", DAY_OF_WEEK);
DATE_PART.put("DAYOFWEEK", DAY_OF_WEEK);
DATE_PART.put("DOW", DAY_OF_WEEK);
DATE_PART.put("ISO_DAY_OF_WEEK", ISO_DAY_OF_WEEK);
DATE_PART.put("DAYOFYEAR", DAY_OF_YEAR);
DATE_PART.put("DAY_OF_YEAR", DAY_OF_YEAR);
DATE_PART.put("DY", DAY_OF_YEAR);
DATE_PART.put("DOY", DAY_OF_YEAR);
DATE_PART.put("SQL_TSI_HOUR", HOUR);
DATE_PART.put("HOUR", HOUR);
DATE_PART.put("HH", HOUR);
DATE_PART.put("SQL_TSI_MINUTE", MINUTE);
DATE_PART.put("MINUTE", MINUTE);
DATE_PART.put("MI", MINUTE);
DATE_PART.put("N", MINUTE);
DATE_PART.put("SQL_TSI_SECOND", SECOND);
DATE_PART.put("SECOND", SECOND);
DATE_PART.put("SS", SECOND);
DATE_PART.put("S", SECOND);
DATE_PART.put("MILLISECOND", MILLISECOND);
DATE_PART.put("MS", MILLISECOND);
DATE_PART.put("EPOCH", EPOCH);
DATE_PART.put("MICROSECOND", MICROSECOND);
DATE_PART.put("MCS", MICROSECOND);
DATE_PART.put("NANOSECOND", NANOSECOND);
DATE_PART.put("NS", NANOSECOND);
DATE_PART.put("TIMEZONE_HOUR", TIMEZONE_HOUR);
DATE_PART.put("TIMEZONE_MINUTE", TIMEZONE_MINUTE);
}
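The map above resolves the many accepted part names (SQL_TSI_* forms, abbreviations such as YY and DD, and full names) to a single field constant; getDatePart upper-cases its argument first, so lookups are case-insensitive. A minimal sketch of the same pattern (the constants and class name here are illustrative, not H2's):

```java
import java.util.HashMap;
import java.util.Locale;

public class DatePartLookup {
    static final int YEAR = 0, MONTH = 1;
    private static final HashMap<String, Integer> PARTS = new HashMap<>();
    static {
        // several aliases map to one canonical field constant
        PARTS.put("YEAR", YEAR);
        PARTS.put("YY", YEAR);
        PARTS.put("SQL_TSI_YEAR", YEAR);
        PARTS.put("MONTH", MONTH);
        PARTS.put("MM", MONTH);
    }

    public static int getDatePart(String part) {
        Integer p = PARTS.get(part.toUpperCase(Locale.ENGLISH));
        if (p == null) {
            throw new IllegalArgumentException("Invalid date part: " + part);
        }
        return p;
    }
}
```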
/**
* DATEADD function.
*
* @param part
* name of date-time part
* @param count
* count to add
* @param v
* value to add to
* @return result
*/
public static Value dateadd(String part, long count, Value v) {
int field = getDatePart(part);
if (field != MILLISECOND && field != MICROSECOND && field != NANOSECOND
&& (count > Integer.MAX_VALUE || count < Integer.MIN_VALUE)) {
throw DbException.getInvalidValueException("DATEADD count", count);
}
boolean withDate = !(v instanceof ValueTime);
boolean withTime = !(v instanceof ValueDate);
boolean forceTimestamp = false;
long[] a = DateTimeUtils.dateAndTimeFromValue(v);
long dateValue = a[0];
long timeNanos = a[1];
switch (field) {
case QUARTER:
count *= 3;
//$FALL-THROUGH$
case YEAR:
case MONTH: {
if (!withDate) {
throw DbException.getInvalidValueException("DATEADD time part", part);
}
long year = DateTimeUtils.yearFromDateValue(dateValue);
long month = DateTimeUtils.monthFromDateValue(dateValue);
int day = DateTimeUtils.dayFromDateValue(dateValue);
if (field == YEAR) {
year += count;
} else {
month += count;
}
dateValue = DateTimeUtils.dateValueFromDenormalizedDate(year, month, day);
return DateTimeUtils.dateTimeToValue(v, dateValue, timeNanos, forceTimestamp);
}
case WEEK:
case ISO_WEEK:
count *= 7;
//$FALL-THROUGH$
case DAY_OF_WEEK:
case ISO_DAY_OF_WEEK:
case DAY_OF_MONTH:
case DAY_OF_YEAR:
if (!withDate) {
throw DbException.getInvalidValueException("DATEADD time part", part);
}
dateValue = DateTimeUtils
.dateValueFromAbsoluteDay(DateTimeUtils.absoluteDayFromDateValue(dateValue) + count);
return DateTimeUtils.dateTimeToValue(v, dateValue, timeNanos, forceTimestamp);
case HOUR:
count *= 3_600_000_000_000L;
break;
case MINUTE:
count *= 60_000_000_000L;
break;
case SECOND:
case EPOCH:
count *= 1_000_000_000;
break;
case MILLISECOND:
count *= 1_000_000;
break;
case MICROSECOND:
count *= 1_000;
break;
case NANOSECOND:
break;
case TIMEZONE_HOUR:
count *= 60;
//$FALL-THROUGH$
case TIMEZONE_MINUTE: {
if (!(v instanceof ValueTimestampTimeZone)) {
throw DbException.getUnsupportedException("DATEADD " + part);
}
count += ((ValueTimestampTimeZone) v).getTimeZoneOffsetMins();
return ValueTimestampTimeZone.fromDateValueAndNanos(dateValue, timeNanos, (short) count);
}
default:
throw DbException.getUnsupportedException("DATEADD " + part);
}
if (!withTime) {
// Treat date as timestamp at the start of this date
forceTimestamp = true;
}
timeNanos += count;
if (timeNanos >= DateTimeUtils.NANOS_PER_DAY || timeNanos < 0) {
long d;
if (timeNanos >= DateTimeUtils.NANOS_PER_DAY) {
d = timeNanos / DateTimeUtils.NANOS_PER_DAY;
} else {
d = (timeNanos - DateTimeUtils.NANOS_PER_DAY + 1) / DateTimeUtils.NANOS_PER_DAY;
}
timeNanos -= d * DateTimeUtils.NANOS_PER_DAY;
return DateTimeUtils.dateTimeToValue(v,
DateTimeUtils.dateValueFromAbsoluteDay(DateTimeUtils.absoluteDayFromDateValue(dateValue) + d),
timeNanos, forceTimestamp);
}
return DateTimeUtils.dateTimeToValue(v, dateValue, timeNanos, forceTimestamp);
}
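The tail of dateadd() carries an out-of-range nanosecond-of-day over into the date part. The two-branch computation of d is exactly a floor division by the number of nanoseconds per day; a standalone sketch (class and method names are illustrative, not H2 API):

```java
public class NanosCarry {
    static final long NANOS_PER_DAY = 24L * 60 * 60 * 1_000_000_000L;

    /** Returns {dayDelta, normalizedNanos} with 0 <= normalizedNanos < NANOS_PER_DAY. */
    public static long[] normalize(long timeNanos) {
        // Math.floorDiv gives the same result as the two-branch code above:
        // plain division for non-negative values, rounded toward negative
        // infinity for negative ones.
        long d = Math.floorDiv(timeNanos, NANOS_PER_DAY);
        return new long[] { d, timeNanos - d * NANOS_PER_DAY };
    }
}
```

For example, one nanosecond before midnight of the previous day normalizes to a day delta of -1 with the remainder back in range.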
/**
* Calculate the number of crossed unit boundaries between two timestamps. This
* method is supported for MS SQL Server compatibility.
*
* <pre>
* DATEDIFF(YEAR, '2004-12-31', '2005-01-01') = 1
* </pre>
*
* @param part
* the part
* @param v1
* the first date-time value
* @param v2
* the second date-time value
* @return the number of crossed boundaries
*/
public static long datediff(String part, Value v1, Value v2) {
int field = getDatePart(part);
long[] a1 = DateTimeUtils.dateAndTimeFromValue(v1);
long dateValue1 = a1[0];
long absolute1 = DateTimeUtils.absoluteDayFromDateValue(dateValue1);
long[] a2 = DateTimeUtils.dateAndTimeFromValue(v2);
long dateValue2 = a2[0];
long absolute2 = DateTimeUtils.absoluteDayFromDateValue(dateValue2);
switch (field) {
case NANOSECOND:
case MICROSECOND:
case MILLISECOND:
case SECOND:
case EPOCH:
case MINUTE:
case HOUR:
long timeNanos1 = a1[1];
long timeNanos2 = a2[1];
switch (field) {
case NANOSECOND:
return (absolute2 - absolute1) * DateTimeUtils.NANOS_PER_DAY + (timeNanos2 - timeNanos1);
case MICROSECOND:
return (absolute2 - absolute1) * (DateTimeUtils.MILLIS_PER_DAY * 1_000)
+ (timeNanos2 / 1_000 - timeNanos1 / 1_000);
case MILLISECOND:
return (absolute2 - absolute1) * DateTimeUtils.MILLIS_PER_DAY
+ (timeNanos2 / 1_000_000 - timeNanos1 / 1_000_000);
case SECOND:
case EPOCH:
return (absolute2 - absolute1) * 86_400 + (timeNanos2 / 1_000_000_000 - timeNanos1 / 1_000_000_000);
case MINUTE:
return (absolute2 - absolute1) * 1_440 + (timeNanos2 / 60_000_000_000L - timeNanos1 / 60_000_000_000L);
case HOUR:
return (absolute2 - absolute1) * 24
+ (timeNanos2 / 3_600_000_000_000L - timeNanos1 / 3_600_000_000_000L);
}
// Fake fall-through
// $FALL-THROUGH$
case DAY_OF_MONTH:
case DAY_OF_YEAR:
case DAY_OF_WEEK:
case ISO_DAY_OF_WEEK:
return absolute2 - absolute1;
case WEEK:
return weekdiff(absolute1, absolute2, 0);
case ISO_WEEK:
return weekdiff(absolute1, absolute2, 1);
case MONTH:
return (DateTimeUtils.yearFromDateValue(dateValue2) - DateTimeUtils.yearFromDateValue(dateValue1)) * 12
+ DateTimeUtils.monthFromDateValue(dateValue2) - DateTimeUtils.monthFromDateValue(dateValue1);
case QUARTER:
return (DateTimeUtils.yearFromDateValue(dateValue2) - DateTimeUtils.yearFromDateValue(dateValue1)) * 4
+ (DateTimeUtils.monthFromDateValue(dateValue2) - 1) / 3
- (DateTimeUtils.monthFromDateValue(dateValue1) - 1) / 3;
case YEAR:
return DateTimeUtils.yearFromDateValue(dateValue2) - DateTimeUtils.yearFromDateValue(dateValue1);
case TIMEZONE_HOUR:
case TIMEZONE_MINUTE: {
int offsetMinutes1;
if (v1 instanceof ValueTimestampTimeZone) {
offsetMinutes1 = ((ValueTimestampTimeZone) v1).getTimeZoneOffsetMins();
} else {
offsetMinutes1 = DateTimeUtils.getTimeZoneOffsetMillis(null, dateValue1, a1[1]);
}
int offsetMinutes2;
if (v2 instanceof ValueTimestampTimeZone) {
offsetMinutes2 = ((ValueTimestampTimeZone) v2).getTimeZoneOffsetMins();
} else {
offsetMinutes2 = DateTimeUtils.getTimeZoneOffsetMillis(null, dateValue2, a2[1]);
}
if (field == TIMEZONE_HOUR) {
return (offsetMinutes2 / 60) - (offsetMinutes1 / 60);
} else {
return offsetMinutes2 - offsetMinutes1;
}
}
default:
throw DbException.getUnsupportedException("DATEDIFF " + part);
}
}
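As the javadoc notes, DATEDIFF counts crossed unit boundaries, not elapsed units. The difference can be illustrated with java.time (this is a hedged comparison sketch, not H2 code): ChronoUnit.YEARS.between counts complete years, while the boundary count in datediff() simply compares the year fields.

```java
import java.time.LocalDate;
import java.time.temporal.ChronoUnit;

public class BoundaryDiff {
    /** YEAR-style boundary count, as datediff() computes it: year2 - year1. */
    public static long yearBoundaries(LocalDate d1, LocalDate d2) {
        return d2.getYear() - d1.getYear();
    }

    public static void main(String[] args) {
        LocalDate d1 = LocalDate.of(2004, 12, 31);
        LocalDate d2 = LocalDate.of(2005, 1, 1);
        // One year boundary (Dec 31 -> Jan 1) is crossed...
        System.out.println(yearBoundaries(d1, d2));           // 1
        // ...but zero complete years have elapsed.
        System.out.println(ChronoUnit.YEARS.between(d1, d2)); // 0
    }
}
```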
/**
* Extracts specified field from the specified date-time value.
*
* @param part
* the date part
* @param value
* the date-time value
* @return extracted field
*/
public static Value extract(String part, Value value) {
Value result;
int field = getDatePart(part);
if (field != EPOCH) {
result = ValueInt.get(getIntDatePart(value, field));
} else {
// Case where we retrieve the EPOCH time.
// First we retrieve the dateValue and its time in nanoseconds.
long[] a = DateTimeUtils.dateAndTimeFromValue(value);
long dateValue = a[0];
long timeNanos = a[1];
// We compute the time in nanoseconds and the total number of days.
BigDecimal timeNanosBigDecimal = new BigDecimal(timeNanos);
BigDecimal numberOfDays = new BigDecimal(DateTimeUtils.absoluteDayFromDateValue(dateValue));
BigDecimal nanosSeconds = new BigDecimal(1_000_000_000);
BigDecimal secondsPerDay = new BigDecimal(DateTimeUtils.SECONDS_PER_DAY);
// Case where the value is of type time e.g. '10:00:00'
if (value instanceof ValueTime) {
// In order to retrieve the EPOCH time we only have to convert the
// time in nanoseconds (previously retrieved) into seconds.
result = ValueDecimal.get(timeNanosBigDecimal.divide(nanosSeconds));
} else if (value instanceof ValueDate) {
// Case where the value is of type date '2000-01-01': we have to retrieve the
// total number of days and multiply it by the number of seconds in a day.
result = ValueDecimal.get(numberOfDays.multiply(secondsPerDay));
} else if (value instanceof ValueTimestampTimeZone) {
// Case where the value is of type ValueTimestampTimeZone
// ('2000-01-01 10:00:00+05').
// We retrieve the time zone offset in minutes
ValueTimestampTimeZone v = (ValueTimestampTimeZone) value;
BigDecimal timeZoneOffsetSeconds = new BigDecimal(v.getTimeZoneOffsetMins() * 60);
// Sum the time in nanoseconds (converted to seconds) and the total
// number of days in seconds, then subtract the time zone offset in seconds.
result = ValueDecimal.get(timeNanosBigDecimal.divide(nanosSeconds)
.add(numberOfDays.multiply(secondsPerDay)).subtract(timeZoneOffsetSeconds));
} else {
// By default, we have the date and the time ('2000-01-01 10:00:00') if no
// specific type is given.
// We just have to sum the time in nanoseconds and the total number of days in
// seconds.
result = ValueDecimal
.get(timeNanosBigDecimal.divide(nanosSeconds).add(numberOfDays.multiply(secondsPerDay)));
}
}
return result;
}
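All the EPOCH branches above reduce to the same arithmetic: absolute days times 86 400, plus the nanosecond-of-day divided by 10^9, minus the zone offset in seconds when one is present. A small standalone sketch of that computation with BigDecimal (the class and parameter names are illustrative, not H2 API):

```java
import java.math.BigDecimal;

public class EpochSketch {
    /** Epoch seconds from days-since-1970, nanos-of-day and offset minutes. */
    public static BigDecimal epochSeconds(long absoluteDay, long timeNanos, int offsetMins) {
        // Dividing a long by 10^9 always yields a terminating decimal,
        // so BigDecimal.divide cannot throw here.
        return BigDecimal.valueOf(absoluteDay).multiply(BigDecimal.valueOf(86_400))
                .add(BigDecimal.valueOf(timeNanos).divide(BigDecimal.valueOf(1_000_000_000)))
                .subtract(BigDecimal.valueOf(offsetMins * 60L));
    }
}
```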
/**
* Formats a date using a format string.
*
* @param date
* the date to format
* @param format
* the format string
* @param locale
* the locale
* @param timeZone
* the timezone
* @return the formatted date
*/
public static String formatDateTime(java.util.Date date, String format, String locale, String timeZone) {
SimpleDateFormat dateFormat = getDateFormat(format, locale, timeZone);
synchronized (dateFormat) {
return dateFormat.format(date);
}
}
private static SimpleDateFormat getDateFormat(String format, String locale, String timeZone) {
try {
// currently, a new instance is created for each call
// however, could cache the last few instances
SimpleDateFormat df;
if (locale == null) {
df = new SimpleDateFormat(format);
} else {
Locale l = new Locale(locale);
df = new SimpleDateFormat(format, l);
}
if (timeZone != null) {
df.setTimeZone(TimeZone.getTimeZone(timeZone));
}
return df;
} catch (Exception e) {
throw DbException.get(ErrorCode.PARSE_ERROR_1, e, format + "/" + locale + "/" + timeZone);
}
}
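formatDateTime and parseDateTime synchronize on the SimpleDateFormat instance because that class is not thread-safe. A common alternative (a design sketch of the general idiom, not what H2 does here) is one instance per thread via ThreadLocal, which avoids both locking and shared mutable state:

```java
import java.text.SimpleDateFormat;
import java.util.Date;

public class PerThreadFormat {
    // Each thread lazily gets its own SimpleDateFormat, so no synchronization
    // is needed when formatting.
    private static final ThreadLocal<SimpleDateFormat> FORMAT =
            ThreadLocal.withInitial(() -> new SimpleDateFormat("yyyy-MM-dd"));

    public static String format(Date date) {
        return FORMAT.get().format(date);
    }
}
```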
private static int getDatePart(String part) {
Integer p = DATE_PART.get(StringUtils.toUpperEnglish(part));
if (p == null) {
throw DbException.getInvalidValueException("date part", part);
}
return p.intValue();
}
/**
* Get the specified field of a date, however with years normalized to positive
* or negative, and month starting with 1.
*
* @param date
* the date value
* @param field
* the field type, see {@link Function} for constants
* @return the value
*/
public static int getIntDatePart(Value date, int field) {
long[] a = DateTimeUtils.dateAndTimeFromValue(date);
long dateValue = a[0];
long timeNanos = a[1];
switch (field) {
case YEAR:
return DateTimeUtils.yearFromDateValue(dateValue);
case MONTH:
return DateTimeUtils.monthFromDateValue(dateValue);
case DAY_OF_MONTH:
return DateTimeUtils.dayFromDateValue(dateValue);
case HOUR:
return (int) (timeNanos / 3_600_000_000_000L % 24);
case MINUTE:
return (int) (timeNanos / 60_000_000_000L % 60);
case SECOND:
return (int) (timeNanos / 1_000_000_000 % 60);
case MILLISECOND:
return (int) (timeNanos / 1_000_000 % 1_000);
case MICROSECOND:
return (int) (timeNanos / 1_000 % 1_000_000);
case NANOSECOND:
return (int) (timeNanos % 1_000_000_000);
case DAY_OF_YEAR:
return DateTimeUtils.getDayOfYear(dateValue);
case DAY_OF_WEEK:
return DateTimeUtils.getSundayDayOfWeek(dateValue);
case WEEK:
GregorianCalendar gc = DateTimeUtils.getCalendar();
return DateTimeUtils.getWeekOfYear(dateValue, gc.getFirstDayOfWeek() - 1, gc.getMinimalDaysInFirstWeek());
case QUARTER:
return (DateTimeUtils.monthFromDateValue(dateValue) - 1) / 3 + 1;
case ISO_YEAR:
return DateTimeUtils.getIsoWeekYear(dateValue);
case ISO_WEEK:
return DateTimeUtils.getIsoWeekOfYear(dateValue);
case ISO_DAY_OF_WEEK:
return DateTimeUtils.getIsoDayOfWeek(dateValue);
case TIMEZONE_HOUR:
case TIMEZONE_MINUTE: {
int offsetMinutes;
if (date instanceof ValueTimestampTimeZone) {
offsetMinutes = ((ValueTimestampTimeZone) date).getTimeZoneOffsetMins();
} else {
offsetMinutes = DateTimeUtils.getTimeZoneOffsetMillis(null, dateValue, timeNanos);
}
if (field == TIMEZONE_HOUR) {
return offsetMinutes / 60;
}
return offsetMinutes % 60;
}
}
throw DbException.getUnsupportedException("getDatePart(" + date + ", " + field + ')');
}
/**
* Returns names of months or weekdays.
*
* @param field
* 0 for months, 1 for weekdays
* @return names of months or weekdays
*/
public static String[] getMonthsAndWeeks(int field) {
String[][] result = MONTHS_AND_WEEKS;
if (result == null) {
result = new String[2][];
DateFormatSymbols dfs = DateFormatSymbols.getInstance(Locale.ENGLISH);
result[0] = dfs.getMonths();
result[1] = dfs.getWeekdays();
MONTHS_AND_WEEKS = result;
}
return result[field];
}
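getMonthsAndWeeks uses racy single-check lazy initialization: MONTHS_AND_WEEKS is volatile and the array is fully built before being published, so two threads may compute the value twice, but both results are identical and safely published. The same idiom in isolation (illustrative names, not H2 API):

```java
import java.text.DateFormatSymbols;
import java.util.Locale;

public class LazyNames {
    private static volatile String[][] cache;

    public static String[] get(int field) {
        String[][] result = cache;          // single volatile read
        if (result == null) {
            result = new String[2][];
            DateFormatSymbols dfs = DateFormatSymbols.getInstance(Locale.ENGLISH);
            result[0] = dfs.getMonths();
            result[1] = dfs.getWeekdays();
            cache = result;                 // publish the fully initialized array
        }
        return result[field];
    }
}
```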
/**
* Check if a given string is a valid date part string.
*
* @param part
* the string
* @return true if it is a valid date part name
*/
public static boolean isDatePart(String part) {
return DATE_PART.containsKey(StringUtils.toUpperEnglish(part));
}
/**
* Parses a date using a format string.
*
* @param date
* the date to parse
* @param format
* the parsing format
* @param locale
* the locale
* @param timeZone
* the timeZone
* @return the parsed date
*/
public static java.util.Date parseDateTime(String date, String format, String locale, String timeZone) {
SimpleDateFormat dateFormat = getDateFormat(format, locale, timeZone);
try {
synchronized (dateFormat) {
return dateFormat.parse(date);
}
} catch (Exception e) {
// ParseException
throw DbException.get(ErrorCode.PARSE_ERROR_1, e, date);
}
}
private static long weekdiff(long absolute1, long absolute2, int firstDayOfWeek) {
absolute1 += 4 - firstDayOfWeek;
long r1 = absolute1 / 7;
if (absolute1 < 0 && (r1 * 7 != absolute1)) {
r1--;
}
absolute2 += 4 - firstDayOfWeek;
long r2 = absolute2 / 7;
if (absolute2 < 0 && (r2 * 7 != absolute2)) {
r2--;
}
return r2 - r1;
}
private DateTimeFunctions() {
}
}
......@@ -9,12 +9,9 @@ package org.h2.util;
import java.sql.Date;
import java.sql.Time;
import java.sql.Timestamp;
import java.text.SimpleDateFormat;
import java.util.Calendar;
import java.util.GregorianCalendar;
import java.util.Locale;
import java.util.TimeZone;
import org.h2.api.ErrorCode;
import org.h2.engine.Mode;
import org.h2.message.DbException;
import org.h2.value.Value;
......@@ -58,7 +55,7 @@ public class DateTimeUtils {
/**
* Date value for 1970-01-01.
*/
public static final int EPOCH_DATE_VALUE = (1970 << SHIFT_YEAR) + (1 << SHIFT_MONTH) + 1;
private static final int[] NORMAL_DAYS_PER_MONTH = { 0, 31, 28, 31, 30, 31,
30, 31, 31, 30, 31, 30, 31 };
......@@ -744,8 +741,7 @@ public class DateTimeUtils {
* @return number of day in year
*/
public static int getDayOfYear(long dateValue) {
return (int) (absoluteDayFromDateValue(dateValue) - absoluteDayFromYear(yearFromDateValue(dateValue))) + 1;
}
/**
......@@ -825,7 +821,7 @@ public class DateTimeUtils {
}
private static long getWeekOfYearBase(int year, int firstDayOfWeek, int minimalDaysInFirstWeek) {
long first = absoluteDayFromYear(year);
int daysInFirstWeek = 8 - getDayOfWeekFromAbsolute(first, firstDayOfWeek);
long base = first + daysInFirstWeek;
if (daysInFirstWeek >= minimalDaysInFirstWeek) {
......@@ -860,67 +856,6 @@ public class DateTimeUtils {
return year;
}
/**
* Formats a date using a format string.
*
* @param date the date to format
* @param format the format string
* @param locale the locale
* @param timeZone the timezone
* @return the formatted date
*/
public static String formatDateTime(java.util.Date date, String format,
String locale, String timeZone) {
SimpleDateFormat dateFormat = getDateFormat(format, locale, timeZone);
synchronized (dateFormat) {
return dateFormat.format(date);
}
}
/**
* Parses a date using a format string.
*
* @param date the date to parse
* @param format the parsing format
* @param locale the locale
* @param timeZone the timeZone
* @return the parsed date
*/
public static java.util.Date parseDateTime(String date, String format,
String locale, String timeZone) {
SimpleDateFormat dateFormat = getDateFormat(format, locale, timeZone);
try {
synchronized (dateFormat) {
return dateFormat.parse(date);
}
} catch (Exception e) {
// ParseException
throw DbException.get(ErrorCode.PARSE_ERROR_1, e, date);
}
}
private static SimpleDateFormat getDateFormat(String format, String locale,
String timeZone) {
try {
// currently, a new instance is created for each call
// however, could cache the last few instances
SimpleDateFormat df;
if (locale == null) {
df = new SimpleDateFormat(format);
} else {
Locale l = new Locale(locale);
df = new SimpleDateFormat(format, l);
}
if (timeZone != null) {
df.setTimeZone(TimeZone.getTimeZone(timeZone));
}
return df;
} catch (Exception e) {
throw DbException.get(ErrorCode.PARSE_ERROR_1, e,
format + "/" + locale + "/" + timeZone);
}
}
/**
* Returns number of days in month.
*
......@@ -1230,6 +1165,26 @@ public class DateTimeUtils {
return ValueTimestampTimeZone.fromDateValueAndNanos(dateValue, timeNanos, (short) offsetMins);
}
/**
* Calculate the absolute day for a January, 1 of the specified year.
*
* @param year
* the year
* @return the absolute day
*/
public static long absoluteDayFromYear(long year) {
year--;
long a = ((year * 1461L) >> 2) - 719_177;
if (year < 1582) {
// Julian calendar
a += 13;
} else if (year < 1900 || year > 2099) {
// Gregorian calendar (slow mode)
a += (year / 400) - (year / 100) + 15;
}
return a;
}
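To illustrate the arithmetic in the new `absoluteDayFromYear()`, here is a standalone copy of the method body from the diff above. Absolute day 0 is the epoch 1970-01-01, so January 1, 1970 must map to 0, and the leap year 1972 must contribute 366 days:

```java
// Standalone sketch of absoluteDayFromYear() from the diff above:
// days from the epoch 1970-01-01 (absolute day 0) to January 1 of the
// given year, with Julian and Gregorian calendar corrections.
public class AbsoluteDaySketch {
    static long absoluteDayFromYear(long year) {
        year--;
        long a = ((year * 1461L) >> 2) - 719_177;
        if (year < 1582) {
            // Julian calendar
            a += 13;
        } else if (year < 1900 || year > 2099) {
            // Gregorian calendar (slow mode)
            a += (year / 400) - (year / 100) + 15;
        }
        return a;
    }

    public static void main(String[] args) {
        check(absoluteDayFromYear(1970) == 0);    // epoch
        check(absoluteDayFromYear(1971) == 365);  // 1970 is not a leap year
        check(absoluteDayFromYear(1972) == 730);
        check(absoluteDayFromYear(1973) == 1096); // 1972 is a leap year
        System.out.println("ok");
    }

    static void check(boolean b) {
        if (!b) {
            throw new AssertionError();
        }
    }
}
```

The `(year * 1461L) >> 2` term counts 365.25 days per year (the Julian average); the two branches then correct for the 1582 cutover and for Gregorian century rules outside the 1900-2099 fast range.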
/**
* Calculate the absolute day from an encoded date value.
*
......@@ -1244,11 +1199,11 @@ public class DateTimeUtils {
y--;
m += 12;
}
long a = ((y * 2922L) >> 3) + DAYS_OFFSET[m - 3] + d - 719_484;
if (y <= 1582 && ((y < 1582) || (m * 100 + d < 1015))) {
long a = ((y * 1461L) >> 2) + DAYS_OFFSET[m - 3] + d - 719_484;
if (y <= 1582 && ((y < 1582) || (m * 100 + d < 10_15))) {
// Julian calendar (cutover at 1582-10-04 / 1582-10-15)
a += 13;
} else if (y < 1901 || y > 2099) {
} else if (y < 1900 || y > 2099) {
// Gregorian calendar (slow mode)
a += (y / 400) - (y / 100) + 15;
}
......@@ -1270,8 +1225,8 @@ public class DateTimeUtils {
y--;
m += 12;
}
long a = ((y * 2922L) >> 3) + DAYS_OFFSET[m - 3] + d - 719_484;
if (y < 1901 || y > 2099) {
long a = ((y * 1461L) >> 2) + DAYS_OFFSET[m - 3] + d - 719_484;
if (y < 1900 || y > 2099) {
// Slow mode
a += (y / 400) - (y / 100) + 15;
}
......@@ -1286,7 +1241,7 @@ public class DateTimeUtils {
*/
public static long dateValueFromAbsoluteDay(long absoluteDay) {
long d = absoluteDay + 719_468;
long y100 = 0, offset;
long y100, offset;
if (d > 578_040) {
// Gregorian calendar
long y400 = d / 146_097;
......@@ -1296,6 +1251,7 @@ public class DateTimeUtils {
offset = y400 * 400 + y100 * 100;
} else {
// Julian calendar
y100 = 0;
d += 292_200_000_002L;
offset = -800_000_000;
}
......@@ -1339,14 +1295,13 @@ public class DateTimeUtils {
if (day < getDaysInMonth(year, month)) {
return dateValue + 1;
}
day = 1;
if (month < 12) {
month++;
} else {
month = 1;
year++;
}
return dateValue(year, month, day);
return dateValue(year, month, 1);
}
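The simplified `incrementDateValue()` tail shown above rolls the day, then the month, then the year. A sketch of the same rollover logic on plain components (the real method works on packed date values and uses H2's `getDaysInMonth()`, which also handles the 1582 cutover; the leap-year check below is a simplified stand-in):

```java
// Sketch of the day/month/year rollover in incrementDateValue().
// daysInMonth() here is a simplified stand-in using plain Gregorian
// leap-year rules, ignoring the 1582 calendar cutover.
public class IncrementSketch {
    static final int[] DAYS = { 0, 31, 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31 };

    static int daysInMonth(int year, int month) {
        if (month == 2 && year % 4 == 0 && (year % 100 != 0 || year % 400 == 0)) {
            return 29;
        }
        return DAYS[month];
    }

    // Returns {year, month, day} of the next calendar day.
    static int[] increment(int year, int month, int day) {
        if (day < daysInMonth(year, month)) {
            return new int[] { year, month, day + 1 };
        }
        if (month < 12) {
            return new int[] { year, month + 1, 1 };
        }
        return new int[] { year + 1, 1, 1 };
    }

    public static void main(String[] args) {
        check(increment(2018, 2, 28), 2018, 3, 1);  // non-leap February
        check(increment(2016, 2, 28), 2016, 2, 29); // leap February
        check(increment(2018, 12, 31), 2019, 1, 1); // year rollover
        System.out.println("ok");
    }

    static void check(int[] r, int y, int m, int d) {
        if (r[0] != y || r[1] != m || r[2] != d) {
            throw new AssertionError();
        }
    }
}
```

In the packed encoding the common case needs no decomposition at all: when the day stays within the month, the next date is literally `dateValue + 1`, which is why the diff keeps that fast path first.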
/**
......
......@@ -91,8 +91,7 @@ public class ToDateParser {
}
if (doyValid) {
dateValue = DateTimeUtils.dateValueFromAbsoluteDay(
DateTimeUtils.absoluteDayFromDateValue(DateTimeUtils.dateValue(year, 1, 1))
+ dayOfYear - 1);
DateTimeUtils.absoluteDayFromYear(year) + dayOfYear - 1);
} else {
int month = this.month;
if (month == 0) {
......
......@@ -845,8 +845,7 @@ public abstract class Value {
case TIME:
// because the time has set the date to 1970-01-01,
// this will be the result
return ValueDate.fromDateValue(
DateTimeUtils.dateValue(1970, 1, 1));
return ValueDate.fromDateValue(DateTimeUtils.EPOCH_DATE_VALUE);
case TIMESTAMP:
return ValueDate.fromDateValue(
((ValueTimestamp) this).getDateValue());
......
......@@ -142,6 +142,7 @@ import org.h2.test.store.TestKillProcessWhileWriting;
import org.h2.test.store.TestMVRTree;
import org.h2.test.store.TestMVStore;
import org.h2.test.store.TestMVStoreBenchmark;
import org.h2.test.store.TestMVStoreStopCompact;
import org.h2.test.store.TestMVStoreTool;
import org.h2.test.store.TestMVTableEngine;
import org.h2.test.store.TestObjectDataType;
......@@ -889,6 +890,7 @@ kill -9 `jps -l | grep "org.h2.test." | cut -d " " -f 1`
addTest(new TestMVRTree());
addTest(new TestMVStore());
addTest(new TestMVStoreBenchmark());
addTest(new TestMVStoreStopCompact());
addTest(new TestMVStoreTool());
addTest(new TestMVTableEngine());
addTest(new TestObjectDataType());
......
......@@ -710,6 +710,11 @@ public class TestIndex extends TestBase {
trace("---done---");
}
/**
* This method is called from the database.
*
* @return the result set
*/
public static ResultSet testFunctionIndexFunction() {
// There are additional callers like JdbcConnection.prepareCommand() and
// CommandContainer.recompileIfRequired()
......
......@@ -13,6 +13,8 @@ import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;
import org.h2.api.ErrorCode;
import org.h2.test.TestBase;
/**
......@@ -210,6 +212,7 @@ public class TestBatchUpdates extends TestBase {
String s = COFFEE_UPDATE;
trace("Prepared Statement String:" + s);
prep = conn.prepareStatement(s);
assertThrows(ErrorCode.PARAMETER_NOT_SET_1, prep).addBatch();
prep.setInt(1, 2);
prep.addBatch();
prep.setInt(1, 3);
......
/*
* Copyright 2004-2018 H2 Group. Multiple-Licensed under the MPL 2.0,
* and the EPL 1.0 (http://h2database.com/html/license.html).
* Initial Developer: H2 Group
*/
package org.h2.test.store;
import java.util.Random;
import org.h2.mvstore.MVMap;
import org.h2.mvstore.MVStore;
import org.h2.store.fs.FileUtils;
import org.h2.test.TestBase;
/**
 * Test that the MVStore eventually stops optimizing (does not excessively optimize).
*/
public class TestMVStoreStopCompact extends TestBase {
/**
* Run just this test.
*
* @param a ignored
*/
public static void main(String... a) throws Exception {
TestBase test = TestBase.createCaller().init();
test.config.big = true;
test.test();
}
@Override
public void test() throws Exception {
if (!config.big) {
return;
}
for (int retentionTime = 10; retentionTime < 1000; retentionTime *= 10) {
for (int timeout = 100; timeout <= 1000; timeout *= 10) {
testStopCompact(retentionTime, timeout);
}
}
}
private void testStopCompact(int retentionTime, int timeout) throws InterruptedException {
String fileName = getBaseDir() + "/testStopCompact.h3";
FileUtils.createDirectories(getBaseDir());
FileUtils.delete(fileName);
// store with a very small page size, to make sure
// there are many leaf pages
MVStore s = new MVStore.Builder().
fileName(fileName).open();
s.setRetentionTime(retentionTime);
MVMap<Integer, String> map = s.openMap("data");
long start = System.currentTimeMillis();
Random r = new Random(1);
for (int i = 0; i < 4000000; i++) {
long time = System.currentTimeMillis() - start;
if (time > timeout) {
break;
}
int x = r.nextInt(10000000);
map.put(x, "Hello World " + i * 10);
}
s.setAutoCommitDelay(100);
long oldWriteCount = s.getFileStore().getWriteCount();
// expect background write to stop after 5 seconds
Thread.sleep(5000);
long newWriteCount = s.getFileStore().getWriteCount();
// expect that compaction didn't cause many writes
assertTrue(newWriteCount - oldWriteCount < 30);
s.close();
}
}
......@@ -26,7 +26,6 @@ public class TestClearReferences extends TestBase {
"org.h2.compress.CompressLZF.cachedHashTable",
"org.h2.engine.DbSettings.defaultSettings",
"org.h2.engine.SessionRemote.sessionFactory",
"org.h2.expression.Function.MONTHS_AND_WEEKS",
"org.h2.jdbcx.JdbcDataSourceFactory.cachedTraceSystem",
"org.h2.store.RecoverTester.instance",
"org.h2.store.fs.FilePath.defaultProvider",
......@@ -37,6 +36,7 @@ public class TestClearReferences extends TestBase {
"org.h2.tools.CompressTool.cachedBuffer",
"org.h2.util.CloseWatcher.queue",
"org.h2.util.CloseWatcher.refs",
"org.h2.util.DateTimeFunctions.MONTHS_AND_WEEKS",
"org.h2.util.DateTimeUtils.timeZone",
"org.h2.util.MathUtils.cachedSecureRandom",
"org.h2.util.NetUtils.cachedLocalAddress",
......
......@@ -371,6 +371,9 @@ public class TestDate extends TestBase {
if (abs != next && next != Long.MIN_VALUE) {
assertEquals(abs, next);
}
if (m == 1 && d == 1) {
assertEquals(abs, DateTimeUtils.absoluteDayFromYear(y));
}
next = abs + 1;
long d2 = DateTimeUtils.dateValueFromAbsoluteDay(abs);
assertEquals(date, d2);
......
......@@ -13,7 +13,7 @@ import java.util.Random;
import org.h2.message.DbException;
import org.h2.test.TestBase;
import org.h2.test.utils.AssertThrows;
import org.h2.util.DateTimeUtils;
import org.h2.util.DateTimeFunctions;
import org.h2.util.StringUtils;
/**
......@@ -85,7 +85,7 @@ public class TestStringUtils extends TestBase {
StringUtils.xmlText("Rand&Blue"));
assertEquals("&lt;&lt;[[[]]]&gt;&gt;",
StringUtils.xmlCData("<<[[[]]]>>"));
Date dt = DateTimeUtils.parseDateTime(
Date dt = DateTimeFunctions.parseDateTime(
"2001-02-03 04:05:06 GMT",
"yyyy-MM-dd HH:mm:ss z", "en", "GMT");
String s = StringUtils.xmlStartDoc()
......@@ -99,10 +99,10 @@ public class TestStringUtils extends TestBase {
+ StringUtils.xmlNode("description", null, "H2 Database Engine")
+ StringUtils.xmlNode("language", null, "en-us")
+ StringUtils.xmlNode("pubDate", null,
DateTimeUtils.formatDateTime(dt,
DateTimeFunctions.formatDateTime(dt,
"EEE, d MMM yyyy HH:mm:ss z", "en", "GMT"))
+ StringUtils.xmlNode("lastBuildDate", null,
DateTimeUtils.formatDateTime(dt,
DateTimeFunctions.formatDateTime(dt,
"EEE, d MMM yyyy HH:mm:ss z", "en", "GMT"))
+ StringUtils.xmlNode("item", null,
StringUtils.xmlNode("title", null,
......
......@@ -767,3 +767,8 @@ interpolated thead
die weekdiff osx subprocess dow proleptic microsecond microseconds divisible cmp denormalized suppressed saturated mcs
london dfs weekdays intermittent looked msec tstz africa monrovia asia tokyo weekday joi callers multipliers ucn
openoffice organize libre systemtables gmane sea borders announced millennium alex nordlund rarely
opti excessively
iterators tech enums incompatibilities loses reimplement readme reorganize