Administrator / h2database / Commits

Commit 92091718 (Unverified)
Authored by Evgenij Ryazanov on Jan 29, 2019; committed via GitHub on Jan 29, 2019

Merge pull request #1701 from katzyn/percentile

Add PERCENTILE_CONT and PERCENTILE_DISC inverse distribution functions

Parents: d5c9eff7, 611c885b

Showing 19 changed files with 798 additions and 275 deletions
- h2/src/docsrc/help/help.csv (+34 -2)
- h2/src/docsrc/html/changelog.html (+14 -0)
- h2/src/main/org/h2/api/ErrorCode.java (+1 -1)
- h2/src/main/org/h2/command/Parser.java (+40 -28)
- h2/src/main/org/h2/expression/aggregate/Aggregate.java (+79 -116)
- h2/src/main/org/h2/expression/aggregate/AggregateData.java (+2 -1)
- .../org/h2/expression/aggregate/AggregateDataCollecting.java (+27 -0)
- ...ain/org/h2/expression/aggregate/AggregateDataDefault.java (+0 -1)
- h2/src/main/org/h2/expression/aggregate/AggregateType.java (+128 -0)
- h2/src/main/org/h2/expression/aggregate/Percentile.java (+139 -61)
- h2/src/main/org/h2/expression/condition/Comparison.java (+1 -1)
- h2/src/main/org/h2/mvstore/MVMap.java (+4 -4)
- h2/src/main/org/h2/mvstore/MVStore.java (+2 -1)
- h2/src/main/org/h2/value/Value.java (+15 -14)
- h2/src/test/org/h2/test/scripts/TestScript.java (+1 -1)
- h2/src/test/org/h2/test/scripts/datatypes/array.sql (+8 -0)
- h2/src/test/org/h2/test/scripts/functions/aggregate/mode.sql (+33 -7)
- ...st/org/h2/test/scripts/functions/aggregate/percentile.sql (+269 -36)
- h2/src/tools/org/h2/build/doc/dictionary.txt (+1 -1)
h2/src/docsrc/help/help.csv

@@ -3674,6 +3674,38 @@ Aggregates are only allowed in select statements.
 VAR_SAMP(X)
 "
+"Functions (Aggregate)","PERCENTILE_CONT","
+PERCENTILE_CONT(numeric) WITHIN GROUP (ORDER BY value [ASC|DESC])
+[FILTER (WHERE expression)] [OVER windowNameOrSpecification]
+","
+Return percentile of values from the group with interpolation.
+Interpolation is only supported for numeric, date-time, and interval data types.
+Argument must be between 0 and 1 inclusive.
+Argument must be the same for all rows in the same group.
+If argument is NULL, the result is NULL.
+NULL values are ignored in the calculation.
+If no rows are selected, the result is NULL.
+Aggregates are only allowed in select statements.
+","
+PERCENTILE_CONT(0.5) WITHIN GROUP (ORDER BY V)
+"
+"Functions (Aggregate)","PERCENTILE_DISC","
+PERCENTILE_DISC(numeric) WITHIN GROUP (ORDER BY value [ASC|DESC])
+[FILTER (WHERE expression)] [OVER windowNameOrSpecification]
+","
+Return percentile of values from the group.
+Interpolation is not performed.
+Argument must be between 0 and 1 inclusive.
+Argument must be the same for all rows in the same group.
+If argument is NULL, the result is NULL.
+NULL values are ignored in the calculation.
+If no rows are selected, the result is NULL.
+Aggregates are only allowed in select statements.
+","
+PERCENTILE_DISC(0.5) WITHIN GROUP (ORDER BY V)
+"
 "Functions (Aggregate)","MEDIAN","
 MEDIAN( [ DISTINCT|ALL ] value )
 [FILTER (WHERE expression)] [OVER windowNameOrSpecification]

@@ -3690,7 +3722,7 @@ MEDIAN(X)
 "Functions (Aggregate)","MODE","
 { MODE( value ) [ ORDER BY expression [ ASC | DESC ] ] }
-| { MODE() WITHIN GROUP(ORDER BY expression [ ASC | DESC ]) }
+| { MODE() WITHIN GROUP
+(ORDER BY expression [ ASC | DESC ]) }
 [FILTER (WHERE expression)] [OVER windowNameOrSpecification]
 ","
 Returns the value that occurs with the greatest frequency.

@@ -3706,7 +3738,7 @@ Aggregates are only allowed in select statements.
 ","
 MODE(X)
 MODE(X ORDER BY X)
-MODE() WITHIN GROUP(ORDER BY X)
+MODE() WITHIN GROUP
+(ORDER BY X)
 "
 "Functions (Aggregate)","ENVELOPE","
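The two help entries above differ only in interpolation. As a rough standalone illustration (plain Java over a pre-sorted array, not H2 code; it mirrors the row-selection rule used elsewhere in this patch):

```java
import java.util.Arrays;

/** Minimal sketch of PERCENTILE_CONT vs PERCENTILE_DISC over a sorted array. */
public class PercentileSketch {

    /** Continuous percentile: linear interpolation between adjacent rows. */
    static double percentileCont(double[] sorted, double p) {
        double fpRow = (sorted.length - 1) * p;
        int lo = (int) fpRow;
        double factor = fpRow - lo;
        if (factor == 0d) {
            return sorted[lo]; // exact hit, no interpolation needed
        }
        return sorted[lo] * (1 - factor) + sorted[lo + 1] * factor;
    }

    /** Discrete percentile: always returns an actual row value. */
    static double percentileDisc(double[] sorted, double p) {
        double fpRow = (sorted.length - 1) * p;
        int lo = (int) fpRow;
        double factor = fpRow - lo;
        // Pick the upper row only when factor > 0.5, matching the
        // rounding rule in this patch's Percentile.getValue.
        return factor > 0.5 ? sorted[lo + 1] : sorted[lo];
    }

    public static void main(String[] args) {
        double[] v = {10, 20, 30, 40};
        System.out.println(percentileCont(v, 0.5)); // 25.0 (interpolated)
        System.out.println(percentileDisc(v, 0.5)); // 20.0 (existing row)
    }
}
```

For a group of 4 rows and argument 0.5 the fractional row is 1.5, so PERCENTILE_CONT averages rows 1 and 2 while PERCENTILE_DISC keeps row 1.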
h2/src/docsrc/html/changelog.html

@@ -21,6 +21,20 @@ Change Log
 <h2>Next Version (unreleased)</h2>
 <ul>
 <li>PR #1701: Add PERCENTILE_CONT and PERCENTILE_DISC inverse distribution functions
 </li>
 <li>Issues #1297, #1697: Failure on concurrent session closure
 </li>
 <li>Issue #1297: removeOldTempIndexes on PageStore causes NullPointerException
 </li>
 <li>Issue #1354: TestCrashAPI: another NPE
 </li>
 <li>PR #1695: Reduce memory for TestMVTempResult to 64m
 </li>
 <li>Issue #1691: Append mode causes OOME in MVPlainTempResult
 </li>
 <li>PR #1692: Use MVTempResult unconditionally
 </li>
 <li>Issue #1689: Use separate constants for data types in Data, ValueDataType, and Transfer
 </li>
 <li>PR #1687: MVMap minor cleanup
h2/src/main/org/h2/api/ErrorCode.java

@@ -414,7 +414,7 @@ public class ErrorCode {
     /**
      * The error with code <code>42131</code> is thrown when
-     * identical expressions should be used, but different exceptions were found.
+     * identical expressions should be used, but different expressions were found.
      * Example:
      * <pre>
      * SELECT MODE(A ORDER BY B) FROM TEST;
h2/src/main/org/h2/command/Parser.java

@@ -177,7 +177,7 @@ import org.h2.expression.Variable;
 import org.h2.expression.Wildcard;
 import org.h2.expression.aggregate.AbstractAggregate;
 import org.h2.expression.aggregate.Aggregate;
-import org.h2.expression.aggregate.Aggregate.AggregateType;
+import org.h2.expression.aggregate.AggregateType;
 import org.h2.expression.aggregate.JavaAggregate;
 import org.h2.expression.analysis.DataAnalysisOperation;
 import org.h2.expression.analysis.Window;
@@ -3055,18 +3055,9 @@ public class Parser {
             break;
         case GROUP_CONCAT: {
             boolean distinct = readDistinctAgg();
-            if (equalsToken("GROUP_CONCAT", aggregateName)) {
-                r = new Aggregate(AggregateType.GROUP_CONCAT, readExpression(), currentSelect, distinct);
-                if (readIf(ORDER)) {
-                    read("BY");
-                    r.setOrderByList(parseSimpleOrderList());
-                }
-                if (readIf("SEPARATOR")) {
-                    r.setGroupConcatSeparator(readExpression());
-                }
-            } else if (equalsToken("STRING_AGG", aggregateName)) {
+            r = new Aggregate(AggregateType.GROUP_CONCAT, readExpression(), currentSelect, distinct);
+            if (equalsToken("STRING_AGG", aggregateName)) {
                 // PostgreSQL compatibility: string_agg(expression, delimiter)
-                r = new Aggregate(AggregateType.GROUP_CONCAT, readExpression(), currentSelect, distinct);
                 read(COMMA);
                 r.setGroupConcatSeparator(readExpression());
                 if (readIf(ORDER)) {
@@ -3074,7 +3065,13 @@ public class Parser {
                     r.setOrderByList(parseSimpleOrderList());
                 }
             } else {
-                r = null;
+                if (readIf(ORDER)) {
+                    read("BY");
+                    r.setOrderByList(parseSimpleOrderList());
+                }
+                if (readIf("SEPARATOR")) {
+                    r.setGroupConcatSeparator(readExpression());
+                }
             }
             break;
         }
@@ -3087,19 +3084,19 @@ public class Parser {
             }
             break;
         }
+        case PERCENTILE_CONT:
+        case PERCENTILE_DISC: {
+            Expression num = readExpression();
+            read(CLOSE_PAREN);
+            r = readWithinGroup(aggregateType, num);
+            break;
+        }
         case MODE: {
             if (readIf(CLOSE_PAREN)) {
-                read("WITHIN");
-                read(GROUP);
-                read(OPEN_PAREN);
-                read(ORDER);
-                read("BY");
-                Expression expr = readExpression();
-                r = new Aggregate(AggregateType.MODE, expr, currentSelect, false);
-                setModeAggOrder(r, expr);
+                r = readWithinGroup(AggregateType.MODE, null);
             } else {
                 Expression expr = readExpression();
-                r = new Aggregate(aggregateType, expr, currentSelect, false);
+                r = new Aggregate(aggregateType, null, currentSelect, false);
                 if (readIf(ORDER)) {
                     read("BY");
                     Expression expr2 = readExpression();
@@ -3108,7 +3105,9 @@ public class Parser {
                     throw DbException.getSyntaxError(ErrorCode.IDENTICAL_EXPRESSIONS_SHOULD_BE_USED, sqlCommand,
                             lastParseIndex, sql, sql2);
                 }
-                setModeAggOrder(r, expr);
+                readAggregateOrder(r, expr, true);
+            } else {
+                readAggregateOrder(r, expr, false);
             }
         }
         break;
@@ -3119,17 +3118,30 @@ public class Parser {
             break;
         }
         read(CLOSE_PAREN);
         if (r != null) {
             readFilterAndOver(r);
         }
         return r;
     }

+    private Aggregate readWithinGroup(AggregateType aggregateType, Expression argument) {
+        Aggregate r;
+        read("WITHIN");
+        read(GROUP);
+        read(OPEN_PAREN);
+        read(ORDER);
+        read("BY");
+        Expression expr = readExpression();
+        r = new Aggregate(aggregateType, argument, currentSelect, false);
+        readAggregateOrder(r, expr, true);
+        return r;
+    }
+
-    private void setModeAggOrder(Aggregate r, Expression expr) {
+    private void readAggregateOrder(Aggregate r, Expression expr, boolean parseSortType) {
         ArrayList<SelectOrderBy> orderList = new ArrayList<>(1);
         SelectOrderBy order = new SelectOrderBy();
         order.expression = expr;
+        if (parseSortType) {
+            order.sortType = parseSimpleSortType();
+        }
         orderList.add(order);
         r.setOrderByList(orderList);
     }
h2/src/main/org/h2/expression/aggregate/Aggregate.java

@@ -48,113 +48,6 @@ import org.h2.value.ValueString;
  */
 public class Aggregate extends AbstractAggregate {

-    public enum AggregateType {
-        /** The aggregate type for COUNT(*). */
-        COUNT_ALL,
-        /** The aggregate type for COUNT(expression). */
-        COUNT,
-        /** The aggregate type for GROUP_CONCAT(...). */
-        GROUP_CONCAT,
-        /** The aggregate type for SUM(expression). */
-        SUM,
-        /** The aggregate type for MIN(expression). */
-        MIN,
-        /** The aggregate type for MAX(expression). */
-        MAX,
-        /** The aggregate type for AVG(expression). */
-        AVG,
-        /** The aggregate type for STDDEV_POP(expression). */
-        STDDEV_POP,
-        /** The aggregate type for STDDEV_SAMP(expression). */
-        STDDEV_SAMP,
-        /** The aggregate type for VAR_POP(expression). */
-        VAR_POP,
-        /** The aggregate type for VAR_SAMP(expression). */
-        VAR_SAMP,
-        /** The aggregate type for ANY(expression). */
-        ANY,
-        /** The aggregate type for EVERY(expression). */
-        EVERY,
-        /** The aggregate type for BOOL_OR(expression). */
-        BIT_OR,
-        /** The aggregate type for BOOL_AND(expression). */
-        BIT_AND,
-        /** The aggregate type for SELECTIVITY(expression). */
-        SELECTIVITY,
-        /** The aggregate type for HISTOGRAM(expression). */
-        HISTOGRAM,
-        /** The aggregate type for MEDIAN(expression). */
-        MEDIAN,
-        /** The aggregate type for ARRAY_AGG(expression). */
-        ARRAY_AGG,
-        /** The aggregate type for MODE(expression). */
-        MODE,
-        /** The aggregate type for ENVELOPE(expression). */
-        ENVELOPE,
-    }
-
     private static final HashMap<String, AggregateType> AGGREGATES = new HashMap<>(64);

     private final AggregateType aggregateType;
@@ -218,6 +111,8 @@ public class Aggregate extends AbstractAggregate {
         addAggregate("HISTOGRAM", AggregateType.HISTOGRAM);
         addAggregate("BIT_OR", AggregateType.BIT_OR);
         addAggregate("BIT_AND", AggregateType.BIT_AND);
+        addAggregate("PERCENTILE_CONT", AggregateType.PERCENTILE_CONT);
+        addAggregate("PERCENTILE_DISC", AggregateType.PERCENTILE_DISC);
         addAggregate("MEDIAN", AggregateType.MEDIAN);
         addAggregate("ARRAY_AGG", AggregateType.ARRAY_AGG);
         addAggregate("MODE", AggregateType.MODE);
@@ -294,14 +189,27 @@ public class Aggregate extends AbstractAggregate {
     }

     private void updateData(Session session, AggregateData data, Value v, Value[] remembered) {
-        if (aggregateType == AggregateType.GROUP_CONCAT) {
+        switch (aggregateType) {
+        case GROUP_CONCAT:
             if (v != ValueNull.INSTANCE) {
                 v = updateCollecting(session, v.convertTo(Value.STRING), remembered);
             }
-        } else if (aggregateType == AggregateType.ARRAY_AGG) {
+            break;
+        case ARRAY_AGG:
             if (v != ValueNull.INSTANCE) {
                 v = updateCollecting(session, v, remembered);
             }
+            break;
+        case PERCENTILE_CONT:
+        case PERCENTILE_DISC:
+            ((AggregateDataCollecting) data).setSharedArgument(v);
+            v = remembered != null ? remembered[1] : orderByList.get(0).expression.getValue(session);
+            break;
+        case MODE:
+            v = remembered != null ? remembered[0] : orderByList.get(0).expression.getValue(session);
+            break;
+        default:
+            // Use argument as is
         }
         data.add(session.getDatabase(), v);
     }
@@ -408,8 +316,23 @@ public class Aggregate extends AbstractAggregate {
             }
             return v;
         }
+        case PERCENTILE_CONT:
+        case PERCENTILE_DISC: {
+            Value v = on.getValue(session);
+            if (v == ValueNull.INSTANCE) {
+                return ValueNull.INSTANCE;
+            }
+            double arg = v.getDouble();
+            if (arg >= 0d && arg <= 1d) {
+                return Percentile.getFromIndex(session, orderByList.get(0).expression, type.getValueType(),
+                        orderByList, arg, aggregateType == AggregateType.PERCENTILE_CONT);
+            } else {
+                throw DbException.getInvalidValueException(aggregateType == AggregateType.PERCENTILE_CONT
+                        ? "PERCENTILE_CONT argument" : "PERCENTILE_DISC argument", arg);
+            }
+        }
         case MEDIAN:
-            return AggregateMedian.medianFromIndex(session, on, type.getValueType());
+            return Percentile.getFromIndex(session, on, type.getValueType(), orderByList, 0.5d, true);
         case ENVELOPE:
             return ((MVSpatialIndex) AggregateDataEnvelope.getGeometryColumnIndex(on)).getBounds(session);
         default:
@@ -468,12 +391,32 @@ public class Aggregate extends AbstractAggregate {
             }
             return ValueArray.get(array);
         }
+        case PERCENTILE_CONT:
+        case PERCENTILE_DISC: {
+            AggregateDataCollecting collectingData = (AggregateDataCollecting) data;
+            Value[] array = collectingData.getArray();
+            if (array == null) {
+                return ValueNull.INSTANCE;
+            }
+            Value v = collectingData.getSharedArgument();
+            if (v == ValueNull.INSTANCE) {
+                return ValueNull.INSTANCE;
+            }
+            double arg = v.getDouble();
+            if (arg >= 0d && arg <= 1d) {
+                return Percentile.getValue(session.getDatabase(), array, type.getValueType(), orderByList, arg,
+                        aggregateType == AggregateType.PERCENTILE_CONT);
+            } else {
+                throw DbException.getInvalidValueException(aggregateType == AggregateType.PERCENTILE_CONT
+                        ? "PERCENTILE_CONT argument" : "PERCENTILE_DISC argument", arg);
+            }
+        }
         case MEDIAN: {
             Value[] array = ((AggregateDataCollecting) data).getArray();
             if (array == null) {
                 return ValueNull.INSTANCE;
             }
-            return AggregateMedian.median(session.getDatabase(), array, type.getValueType());
+            return Percentile.getValue(session.getDatabase(), array, type.getValueType(), orderByList, 0.5d, true);
         }
         case MODE:
             return getMode(session, data);
@@ -647,7 +590,11 @@ public class Aggregate extends AbstractAggregate {
         case MIN:
         case MAX:
         case MEDIAN:
             break;
+        case PERCENTILE_CONT:
+        case PERCENTILE_DISC:
         case MODE:
             type = orderByList.get(0).expression.getType();
             break;
         case STDDEV_POP:
         case STDDEV_SAMP:
@@ -772,6 +719,12 @@ public class Aggregate extends AbstractAggregate {
         case BIT_OR:
             text = "BIT_OR";
             break;
+        case PERCENTILE_CONT:
+            text = "PERCENTILE_CONT";
+            break;
+        case PERCENTILE_DISC:
+            text = "PERCENTILE_DISC";
+            break;
         case MEDIAN:
             text = "MEDIAN";
             break;
@@ -792,11 +745,18 @@ public class Aggregate extends AbstractAggregate {
             on.getSQL(builder).append(')');
         } else {
             builder.append('(');
+            if (on != null) {
                 if (on instanceof Subquery) {
                     on.getSQL(builder);
                 } else {
                     on.getUnenclosedSQL(builder);
                 }
+            }
             builder.append(')');
         }
+        if (orderByList != null) {
+            builder.append(" WITHIN GROUP (");
+            Window.appendOrderBy(builder, orderByList);
+            builder.append(')');
+        }
         return appendTailConditions(builder);
@@ -836,11 +796,14 @@ public class Aggregate extends AbstractAggregate {
         case MAX:
             Index index = getMinMaxColumnIndex();
             return index != null;
+        case PERCENTILE_CONT:
+        case PERCENTILE_DISC:
+            return on.isConstant() && Percentile.getColumnIndex(orderByList.get(0).expression) != null;
         case MEDIAN:
             if (distinct) {
                 return false;
             }
-            return AggregateMedian.getMedianColumnIndex(on) != null;
+            return Percentile.getColumnIndex(on) != null;
         case ENVELOPE:
             return AggregateDataEnvelope.getGeometryColumnIndex(on) != null;
         default:
h2/src/main/org/h2/expression/aggregate/AggregateData.java

@@ -7,7 +7,6 @@ package org.h2.expression.aggregate;
 import org.h2.engine.Constants;
 import org.h2.engine.Database;
-import org.h2.expression.aggregate.Aggregate.AggregateType;
 import org.h2.message.DbException;
 import org.h2.value.Value;

@@ -35,6 +34,8 @@ abstract class AggregateData {
             break;
         case GROUP_CONCAT:
         case ARRAY_AGG:
+        case PERCENTILE_CONT:
+        case PERCENTILE_DISC:
         case MEDIAN:
             break;
         case MIN:
h2/src/main/org/h2/expression/aggregate/AggregateDataCollecting.java

@@ -11,7 +11,9 @@ import java.util.Collections;
 import java.util.Iterator;
 import java.util.TreeSet;
+import org.h2.api.ErrorCode;
 import org.h2.engine.Database;
+import org.h2.message.DbException;
 import org.h2.value.Value;
 import org.h2.value.ValueNull;

@@ -31,6 +33,8 @@ class AggregateDataCollecting extends AggregateData implements Iterable<Value> {
     Collection<Value> values;

+    private Value shared;
+
     /**
      * Creates new instance of data for collecting aggregates.
      *

@@ -84,4 +88,27 @@ class AggregateDataCollecting extends AggregateData implements Iterable<Value> {
         return values != null ? values.iterator() : Collections.<Value>emptyIterator();
     }
+
+    /**
+     * Sets value of a shared argument.
+     *
+     * @param shared the shared value
+     */
+    void setSharedArgument(Value shared) {
+        if (this.shared == null) {
+            this.shared = shared;
+        } else if (!this.shared.equals(shared)) {
+            throw DbException.get(ErrorCode.INVALID_VALUE_2, "Inverse distribution function argument",
+                    this.shared.getTraceSQL() + "<>" + shared.getTraceSQL());
+        }
+    }
+
+    /**
+     * Returns value of a shared argument.
+     *
+     * @return value of a shared argument
+     */
+    Value getSharedArgument() {
+        return shared;
+    }
 }
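The new setSharedArgument method enforces that the percentile argument is identical across every row of a group: the first value is kept, and any later, different value raises an error. A stripped-down sketch of the same invariant (hypothetical class and exception, not H2 code):

```java
/** Sketch of the shared-argument invariant used by AggregateDataCollecting. */
public class SharedArgumentSketch {

    private Object shared;

    /** Keeps the first argument seen; rejects any different later argument. */
    void setSharedArgument(Object value) {
        if (shared == null) {
            shared = value;
        } else if (!shared.equals(value)) {
            throw new IllegalArgumentException(
                    "Inverse distribution function argument: " + shared + " <> " + value);
        }
    }

    public static void main(String[] args) {
        SharedArgumentSketch s = new SharedArgumentSketch();
        s.setSharedArgument(0.5);
        s.setSharedArgument(0.5); // same value: accepted
        try {
            s.setSharedArgument(0.9); // different value: rejected
        } catch (IllegalArgumentException e) {
            System.out.println("rejected: " + e.getMessage());
        }
    }
}
```

This is why a query such as `PERCENTILE_CONT(X) WITHIN GROUP (ORDER BY V)` with a non-constant X fails with INVALID_VALUE_2 as soon as two rows disagree.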
h2/src/main/org/h2/expression/aggregate/AggregateDataDefault.java

@@ -6,7 +6,6 @@
 package org.h2.expression.aggregate;

 import org.h2.engine.Database;
-import org.h2.expression.aggregate.Aggregate.AggregateType;
 import org.h2.message.DbException;
 import org.h2.value.DataType;
 import org.h2.value.Value;
h2/src/main/org/h2/expression/aggregate/AggregateType.java (new file, 0 → 100644)

/*
 * Copyright 2004-2019 H2 Group. Multiple-Licensed under the MPL 2.0,
 * and the EPL 1.0 (http://h2database.com/html/license.html).
 * Initial Developer: H2 Group
 */
package org.h2.expression.aggregate;

/**
 * The type of an aggregate function.
 */
public enum AggregateType {
    /** The aggregate type for COUNT(*). */
    COUNT_ALL,
    /** The aggregate type for COUNT(expression). */
    COUNT,
    /** The aggregate type for GROUP_CONCAT(...). */
    GROUP_CONCAT,
    /** The aggregate type for SUM(expression). */
    SUM,
    /** The aggregate type for MIN(expression). */
    MIN,
    /** The aggregate type for MAX(expression). */
    MAX,
    /** The aggregate type for AVG(expression). */
    AVG,
    /** The aggregate type for STDDEV_POP(expression). */
    STDDEV_POP,
    /** The aggregate type for STDDEV_SAMP(expression). */
    STDDEV_SAMP,
    /** The aggregate type for VAR_POP(expression). */
    VAR_POP,
    /** The aggregate type for VAR_SAMP(expression). */
    VAR_SAMP,
    /** The aggregate type for ANY(expression). */
    ANY,
    /** The aggregate type for EVERY(expression). */
    EVERY,
    /** The aggregate type for BOOL_OR(expression). */
    BIT_OR,
    /** The aggregate type for BOOL_AND(expression). */
    BIT_AND,
    /** The aggregate type for SELECTIVITY(expression). */
    SELECTIVITY,
    /** The aggregate type for HISTOGRAM(expression). */
    HISTOGRAM,
    /** The aggregate type for PERCENTILE_CONT(expression). */
    PERCENTILE_CONT,
    /** The aggregate type for PERCENTILE_DISC(expression). */
    PERCENTILE_DISC,
    /** The aggregate type for MEDIAN(expression). */
    MEDIAN,
    /** The aggregate type for ARRAY_AGG(expression). */
    ARRAY_AGG,
    /** The aggregate type for MODE(expression). */
    MODE,
    /** The aggregate type for ENVELOPE(expression). */
    ENVELOPE,
}
h2/src/main/org/h2/expression/aggregate/AggregateMedian.java → h2/src/main/org/h2/expression/aggregate/Percentile.java

@@ -6,10 +6,12 @@
 package org.h2.expression.aggregate;

 import java.math.BigDecimal;
 import java.math.BigInteger;
 import java.util.ArrayList;
 import java.util.Arrays;
 import org.h2.api.IntervalQualifier;
 import org.h2.command.dml.SelectOrderBy;
 import org.h2.engine.Database;
 import org.h2.engine.Mode;
 import org.h2.engine.Session;
@@ -41,25 +43,26 @@ import org.h2.value.ValueTimestamp;
 import org.h2.value.ValueTimestampTimeZone;

 /**
- * MEDIAN aggregate.
+ * PERCENTILE_CONT, PERCENTILE_DISC, and MEDIAN inverse distribution functions.
  */
-final class AggregateMedian {
+final class Percentile {

     private static boolean isNullsLast(Index index) {
         IndexColumn ic = index.getIndexColumns()[0];
         int sortType = ic.sortType;
         return (sortType & SortOrder.NULLS_LAST) != 0
                 || (sortType & SortOrder.NULLS_FIRST) == 0
-                        && ((sortType & SortOrder.DESCENDING) != 0 ^ SysProperties.SORT_NULLS_HIGH);
+                        && (sortType & SortOrder.DESCENDING) != 0 ^ SysProperties.SORT_NULLS_HIGH;
     }
     /**
-     * Get the index (if any) for the column specified in the median aggregate.
+     * Get the index (if any) for the column specified in the inverse
+     * distribution function.
      *
      * @param on the expression (usually a column expression)
      * @return the index, or null
      */
-    static Index getMedianColumnIndex(Expression on) {
+    static Index getColumnIndex(Expression on) {
         if (on instanceof ExpressionColumn) {
             ExpressionColumn col = (ExpressionColumn) on;
             Column column = col.getColumn();
@@ -92,35 +95,64 @@ final class AggregateMedian {
     }

     /**
-     * Get the median from the array of values.
+     * Get the result from the array of values.
      *
      * @param database the database
      * @param array array with values
      * @param dataType the data type
+     * @param orderByList ORDER BY list
+     * @param percentile argument of percentile function, or 0.5d for median
+     * @param interpolate whether value should be interpolated
      * @return the result
      */
-    static Value median(Database database, Value[] array, int dataType) {
+    static Value getValue(Database database, Value[] array, int dataType, ArrayList<SelectOrderBy> orderByList,
+            double percentile, boolean interpolate) {
         final CompareMode compareMode = database.getCompareMode();
         Arrays.sort(array, compareMode);
-        int len = array.length;
-        int idx = len / 2;
-        Value v1 = array[idx];
-        if ((len & 1) == 1) {
-            return v1.convertTo(dataType);
-        }
-        return getMedian(array[idx - 1], v1, dataType, database.getMode(), compareMode);
+        int count = array.length;
+        boolean reverseIndex = orderByList != null && (orderByList.get(0).sortType & SortOrder.DESCENDING) != 0;
+        double fpRow = (count - 1) * percentile;
+        int rowIdx1 = (int) fpRow;
+        double factor = fpRow - rowIdx1;
+        int rowIdx2;
+        if (factor == 0d) {
+            interpolate = false;
+            rowIdx2 = rowIdx1;
+        } else {
+            rowIdx2 = rowIdx1 + 1;
+            if (!interpolate) {
+                if (factor > 0.5d) {
+                    rowIdx1 = rowIdx2;
+                } else {
+                    rowIdx2 = rowIdx1;
+                }
+            }
+        }
+        if (reverseIndex) {
+            rowIdx1 = count - 1 - rowIdx1;
+            rowIdx2 = count - 1 - rowIdx2;
+        }
+        Value v = array[rowIdx1];
+        if (!interpolate) {
+            return v.convertTo(dataType);
+        }
+        return interpolate(v, array[rowIdx2], factor, dataType, database.getMode(), compareMode);
     }
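The heart of getValue above is the mapping from a percentile to one or two row indices. A standalone sketch of just that computation (assumed helper class, not H2 code), returning the pair of candidate rows:

```java
/** Sketch of the row-index selection in Percentile.getValue. */
public class RowIndexSketch {

    /**
     * Returns {rowIdx1, rowIdx2}: the rows to read for a given percentile.
     * The target position is the fractional index (count - 1) * percentile.
     */
    static long[] rowIndexes(long count, double percentile, boolean interpolate) {
        double fpRow = (count - 1) * percentile;
        long rowIdx1 = (long) fpRow;
        double factor = fpRow - rowIdx1;
        long rowIdx2;
        if (factor == 0d) {
            rowIdx2 = rowIdx1; // exact hit: both candidates are the same row
        } else {
            rowIdx2 = rowIdx1 + 1;
            if (!interpolate) {
                // PERCENTILE_DISC: snap to the nearer of the two rows
                if (factor > 0.5d) {
                    rowIdx1 = rowIdx2;
                } else {
                    rowIdx2 = rowIdx1;
                }
            }
        }
        return new long[] {rowIdx1, rowIdx2};
    }

    public static void main(String[] args) {
        // 5 rows, p = 0.25 -> fractional row 1.0 -> exact row 1
        System.out.println(java.util.Arrays.toString(rowIndexes(5, 0.25, true)));
        // 4 rows, p = 0.5 -> fractional row 1.5 -> rows 1 and 2 for PERCENTILE_CONT
        System.out.println(java.util.Arrays.toString(rowIndexes(4, 0.5, true)));
    }
}
```

The descending-order adjustment in the real method then mirrors both indices with count - 1 - rowIdx, which this sketch omits.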
     /**
-     * Get the median from the index.
+     * Get the result from the index.
      *
      * @param session the session
-     * @param on the expression
+     * @param expression the expression
      * @param dataType the data type
+     * @param orderByList ORDER BY list
+     * @param percentile argument of percentile function, or 0.5d for median
+     * @param interpolate whether value should be interpolated
      * @return the result
      */
-    static Value medianFromIndex(Session session, Expression on, int dataType) {
-        Index index = getMedianColumnIndex(on);
+    static Value getFromIndex(Session session, Expression expression, int dataType,
+            ArrayList<SelectOrderBy> orderByList, double percentile, boolean interpolate) {
+        Index index = getColumnIndex(expression);
         long count = index.getRowCount(session);
         if (count == 0) {
             return ValueNull.INSTANCE;
@@ -128,7 +160,7 @@ final class AggregateMedian {
         Cursor cursor = index.find(session, null, null);
         cursor.next();
         int columnId = index.getColumns()[0].getColumnId();
-        ExpressionColumn expr = (ExpressionColumn) on;
+        ExpressionColumn expr = (ExpressionColumn) expression;
         if (expr.getColumn().isNullable()) {
             boolean hasNulls = false;
             SearchRow row;
@@ -165,7 +197,26 @@ final class AggregateMedian {
                 }
             }
         }
-        long skip = (count - 1) / 2;
+        boolean reverseIndex = (orderByList != null ? orderByList.get(0).sortType & SortOrder.DESCENDING : 0)
+                != (index.getIndexColumns()[0].sortType & SortOrder.DESCENDING);
+        double fpRow = (count - 1) * percentile;
+        long rowIdx1 = (long) fpRow;
+        double factor = fpRow - rowIdx1;
+        long rowIdx2;
+        if (factor == 0d) {
+            interpolate = false;
+            rowIdx2 = rowIdx1;
+        } else {
+            rowIdx2 = rowIdx1 + 1;
+            if (!interpolate) {
+                if (factor > 0.5d) {
+                    rowIdx1 = rowIdx2;
+                } else {
+                    rowIdx2 = rowIdx1;
+                }
+            }
+        }
+        long skip = reverseIndex ? count - 1 - rowIdx2 : rowIdx1;
         for (int i = 0; i < skip; i++) {
             cursor.next();
         }
@@ -177,7 +228,7 @@ final class AggregateMedian {
         if (v == ValueNull.INSTANCE) {
             return v;
         }
-        if ((count & 1) == 0) {
+        if (interpolate) {
             cursor.next();
             row = cursor.getSearchRow();
             if (row == null) {
@@ -188,73 +239,90 @@ final class AggregateMedian {
                 return v;
             }
             Database database = session.getDatabase();
-            return getMedian(v, v2, dataType, database.getMode(), database.getCompareMode());
+            if (reverseIndex) {
+                Value t = v;
+                v = v2;
+                v2 = t;
+            }
+            return interpolate(v, v2, factor, dataType, database.getMode(), database.getCompareMode());
         }
         return v;
     }

-    private static Value getMedian(Value v0, Value v1, int dataType, Mode databaseMode, CompareMode compareMode) {
-        int cmp = v0.compareTo(v1, databaseMode, compareMode);
-        if (cmp == 0) {
+    private static Value interpolate(Value v0, Value v1, double factor, int dataType, Mode databaseMode,
+            CompareMode compareMode) {
+        if (v0.compareTo(v1, databaseMode, compareMode) == 0) {
             return v0.convertTo(dataType);
         }
         switch (dataType) {
         case Value.BYTE:
         case Value.SHORT:
         case Value.INT:
-            return ValueInt.get((v0.getInt() + v1.getInt()) / 2).convertTo(dataType);
+            return ValueInt.get((int) (v0.getInt() * (1 - factor) + v1.getInt() * factor)).convertTo(dataType);
         case Value.LONG:
-            return ValueLong.get((v0.getLong() + v1.getLong()) / 2);
+            return ValueLong.get(interpolateDecimal(BigDecimal.valueOf(v0.getLong()),
+                    BigDecimal.valueOf(v1.getLong()), factor).longValue());
         case Value.DECIMAL:
-            return ValueDecimal.get(v0.getBigDecimal().add(v1.getBigDecimal()).divide(BigDecimal.valueOf(2)));
+            return ValueDecimal.get(interpolateDecimal(v0.getBigDecimal(), v1.getBigDecimal(), factor));
         case Value.FLOAT:
-            return ValueFloat.get((v0.getFloat() + v1.getFloat()) / 2);
+            return ValueFloat.get(interpolateDecimal(BigDecimal.valueOf(v0.getFloat()),
+                    BigDecimal.valueOf(v1.getFloat()), factor).floatValue());
         case Value.DOUBLE:
-            return ValueDouble.get((v0.getFloat() + v1.getDouble()) / 2);
+            return ValueDouble.get(interpolateDecimal(BigDecimal.valueOf(v0.getDouble()),
+                    BigDecimal.valueOf(v1.getDouble()), factor).doubleValue());
         case Value.TIME: {
             ValueTime t0 = (ValueTime) v0.convertTo(Value.TIME), t1 = (ValueTime) v1.convertTo(Value.TIME);
-            return ValueTime.fromNanos((t0.getNanos() + t1.getNanos()) / 2);
+            BigDecimal n0 = BigDecimal.valueOf(t0.getNanos());
+            BigDecimal n1 = BigDecimal.valueOf(t1.getNanos());
+            return ValueTime.fromNanos(interpolateDecimal(n0, n1, factor).longValue());
         }
         case Value.DATE: {
             ValueDate d0 = (ValueDate) v0.convertTo(Value.DATE), d1 = (ValueDate) v1.convertTo(Value.DATE);
+            BigDecimal a0 = BigDecimal.valueOf(DateTimeUtils.absoluteDayFromDateValue(d0.getDateValue()));
+            BigDecimal a1 = BigDecimal.valueOf(DateTimeUtils.absoluteDayFromDateValue(d1.getDateValue()));
             return ValueDate.fromDateValue(
-                    DateTimeUtils.dateValueFromAbsoluteDay((DateTimeUtils.absoluteDayFromDateValue(d0.getDateValue())
-                            + DateTimeUtils.absoluteDayFromDateValue(d1.getDateValue())) / 2));
+                    DateTimeUtils.dateValueFromAbsoluteDay(interpolateDecimal(a0, a1, factor).longValue()));
         }
         case Value.TIMESTAMP: {
             ValueTimestamp ts0 = (ValueTimestamp) v0.convertTo(Value.TIMESTAMP),
                     ts1 = (ValueTimestamp) v1.convertTo(Value.TIMESTAMP);
-            long dateSum = DateTimeUtils.absoluteDayFromDateValue(ts0.getDateValue())
-                    + DateTimeUtils.absoluteDayFromDateValue(ts1.getDateValue());
-            long nanos = (ts0.getTimeNanos() + ts1.getTimeNanos()) / 2;
-            if ((dateSum & 1) != 0) {
-                nanos += DateTimeUtils.NANOS_PER_DAY / 2;
-                if (nanos >= DateTimeUtils.NANOS_PER_DAY) {
-                    nanos -= DateTimeUtils.NANOS_PER_DAY;
-                    dateSum++;
-                }
-            }
-            return ValueTimestamp.fromDateValueAndNanos(DateTimeUtils.dateValueFromAbsoluteDay(dateSum / 2), nanos);
+            BigDecimal a0 = timestampToDecimal(ts0.getDateValue(), ts0.getTimeNanos());
+            BigDecimal a1 = timestampToDecimal(ts1.getDateValue(), ts1.getTimeNanos());
+            BigInteger[] dr = interpolateDecimal(a0, a1, factor).toBigInteger()
+                    .divideAndRemainder(IntervalUtils.NANOS_PER_DAY_BI);
+            long absoluteDay = dr[0].longValue();
+            long timeNanos = dr[1].longValue();
+            if (timeNanos < 0) {
+                timeNanos += DateTimeUtils.NANOS_PER_DAY;
+                absoluteDay--;
+            }
+            return ValueTimestamp.fromDateValueAndNanos(DateTimeUtils.dateValueFromAbsoluteDay(absoluteDay),
+                    timeNanos);
         }
         case Value.TIMESTAMP_TZ: {
             ValueTimestampTimeZone ts0 = (ValueTimestampTimeZone) v0.convertTo(Value.TIMESTAMP_TZ),
                     ts1 = (ValueTimestampTimeZone) v1.convertTo(Value.TIMESTAMP_TZ);
-            long dateSum = DateTimeUtils.absoluteDayFromDateValue(ts0.getDateValue())
-                    + DateTimeUtils.absoluteDayFromDateValue(ts1.getDateValue());
-            long nanos = (ts0.getTimeNanos() + ts1.getTimeNanos()) / 2;
-            int offset = ts0.getTimeZoneOffsetMins() + ts1.getTimeZoneOffsetMins();
-            if ((dateSum & 1) != 0) {
-                nanos += DateTimeUtils.NANOS_PER_DAY / 2;
-            }
-            if ((offset & 1) != 0) {
-                nanos += 30_000_000_000L;
-            }
-            if (nanos >= DateTimeUtils.NANOS_PER_DAY) {
-                nanos -= DateTimeUtils.NANOS_PER_DAY;
-                dateSum++;
-            }
-            return ValueTimestampTimeZone.fromDateValueAndNanos(DateTimeUtils.dateValueFromAbsoluteDay(dateSum / 2),
-                    nanos, (short) (offset / 2));
+            BigDecimal a0 = timestampToDecimal(ts0.getDateValue(), ts0.getTimeNanos());
+            BigDecimal a1 = timestampToDecimal(ts1.getDateValue(), ts1.getTimeNanos());
+            double offset = ts0.getTimeZoneOffsetMins() * (1 - factor) + ts1.getTimeZoneOffsetMins() * factor;
+            short sOffset = (short) offset;
+            BigDecimal bd = interpolateDecimal(a0, a1, factor);
+            if (offset != sOffset) {
+                bd = bd.add(BigDecimal.valueOf(offset - sOffset)
+                        .multiply(BigDecimal.valueOf(DateTimeUtils.NANOS_PER_MINUTE)));
+            }
+            BigInteger[] dr = bd.toBigInteger().divideAndRemainder(IntervalUtils.NANOS_PER_DAY_BI);
+            long absoluteDay = dr[0].longValue();
+            long timeNanos = dr[1].longValue();
+            if (timeNanos < 0) {
+                timeNanos += DateTimeUtils.NANOS_PER_DAY;
+                absoluteDay--;
+            }
+            return ValueTimestampTimeZone.fromDateValueAndNanos(DateTimeUtils.dateValueFromAbsoluteDay(absoluteDay),
+                    timeNanos, sOffset);
         }
         case Value.INTERVAL_YEAR:
         case Value.INTERVAL_MONTH:
@@ -270,15 +338,25 @@ final class AggregateMedian {
case
Value
.
INTERVAL_HOUR_TO_SECOND
:
case
Value
.
INTERVAL_MINUTE_TO_SECOND
:
return
IntervalUtils
.
intervalFromAbsolute
(
IntervalQualifier
.
valueOf
(
dataType
-
Value
.
INTERVAL_YEAR
),
IntervalUtils
.
intervalToAbsolute
((
ValueInterval
)
v0
)
.
add
(
IntervalUtils
.
intervalToAbsolute
((
ValueInterval
)
v1
)).
shiftRight
(
1
));
interpolateDecimal
(
new
BigDecimal
(
IntervalUtils
.
intervalToAbsolute
((
ValueInterval
)
v0
)),
new
BigDecimal
(
IntervalUtils
.
intervalToAbsolute
((
ValueInterval
)
v1
)),
factor
)
.
toBigInteger
());
default
:
// Just return smaller
return
(
cmp
<
0
?
v0
:
v1
).
convertTo
(
dataType
);
// Use the same rules as PERCENTILE_DISC
return
(
factor
>
0.5d
?
v1
:
v0
).
convertTo
(
dataType
);
}
}
private
static
BigDecimal
timestampToDecimal
(
long
dateValue
,
long
timeNanos
)
{
return
new
BigDecimal
(
BigInteger
.
valueOf
(
DateTimeUtils
.
absoluteDayFromDateValue
(
dateValue
))
.
multiply
(
IntervalUtils
.
NANOS_PER_DAY_BI
).
add
(
BigInteger
.
valueOf
(
timeNanos
)));
}
private
static
BigDecimal
interpolateDecimal
(
BigDecimal
d0
,
BigDecimal
d1
,
double
factor
)
{
return
d0
.
multiply
(
BigDecimal
.
valueOf
(
1
-
factor
)).
add
(
d1
.
multiply
(
BigDecimal
.
valueOf
(
factor
)));
}
private
AggregateMedian
()
{
private
Percentile
()
{
}
}
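The `interpolateDecimal` helper in the diff above is the arithmetic core of PERCENTILE_CONT: the result is the weighted average `d0 * (1 - factor) + d1 * factor` of the two values adjacent to the requested percentile position. A minimal standalone sketch of the same formula (the class and method names here are illustrative, not part of H2):

```java
import java.math.BigDecimal;

public class InterpolateDemo {
    // Same formula as Percentile.interpolateDecimal in the diff:
    // d0 * (1 - factor) + d1 * factor
    static BigDecimal interpolate(BigDecimal d0, BigDecimal d1, double factor) {
        return d0.multiply(BigDecimal.valueOf(1 - factor))
                .add(d1.multiply(BigDecimal.valueOf(factor)));
    }

    public static void main(String[] args) {
        // Factor 0.5 is the median case: halfway between 10 and 20
        System.out.println(interpolate(BigDecimal.TEN, BigDecimal.valueOf(20), 0.5)); // prints 15.0
    }
}
```

For temporal types the same helper is reused after `timestampToDecimal` flattens the timestamp into a single nanosecond count, which is why the TIMESTAMP WITH TIME ZONE branch no longer needs the hand-rolled carry logic of the old median code.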
h2/src/main/org/h2/expression/condition/Comparison.java

```diff
@@ -15,7 +15,7 @@ import org.h2.expression.ExpressionVisitor;
 import org.h2.expression.Parameter;
 import org.h2.expression.ValueExpression;
 import org.h2.expression.aggregate.Aggregate;
-import org.h2.expression.aggregate.Aggregate.AggregateType;
+import org.h2.expression.aggregate.AggregateType;
 import org.h2.index.IndexCondition;
 import org.h2.message.DbException;
 import org.h2.table.Column;
```
h2/src/main/org/h2/mvstore/MVMap.java

```diff
@@ -841,16 +841,16 @@ public class MVMap<K, V> extends AbstractMap<K, V>
     /**
      * Use the new root page from now on.
-     * @param expectedRootRefrence expected current root reference
+     * @param expectedRootReference expected current root reference
      * @param newRootPage the new root page
      * @param attemptUpdateCounter how many attempt (including current)
      *                             were made to update root
      * @return new RootReference or null if update failed
      */
-    protected final boolean updateRoot(RootReference expectedRootRefrence, Page newRootPage,
+    protected final boolean updateRoot(RootReference expectedRootReference, Page newRootPage,
             int attemptUpdateCounter) {
         RootReference currentRoot = flushAndGetRoot();
-        return currentRoot == expectedRootRefrence &&
+        return currentRoot == expectedRootReference &&
                 !currentRoot.lockedForUpdate &&
                 root.compareAndSet(currentRoot, new RootReference(currentRoot, newRootPage, attemptUpdateCounter));
```
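`updateRoot` above is an optimistic update: it succeeds only if the root the caller observed is still the current one, using `AtomicReference.compareAndSet`. A tiny sketch of that pattern under the same idea (the names are illustrative, not H2's classes):

```java
import java.util.concurrent.atomic.AtomicReference;

public class CasUpdateDemo {
    // Stand-in for MVMap's root field; the real code holds a RootReference.
    static final AtomicReference<String> root = new AtomicReference<>("r1");

    // Succeeds only when the root the caller observed is still current,
    // mirroring the currentRoot == expectedRootReference && compareAndSet check.
    static boolean updateRoot(String expected, String newRoot) {
        return root.get() == expected && root.compareAndSet(expected, newRoot);
    }

    public static void main(String[] args) {
        System.out.println(updateRoot("stale", "r2")); // false: observed root is outdated
        System.out.println(updateRoot("r1", "r2"));    // true: CAS from "r1" to "r2"
    }
}
```

A failed update leaves the root untouched; the caller is expected to reread the root and retry, which is what the `attemptUpdateCounter` parameter in the real signature tracks.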
h2/src/main/org/h2/mvstore/MVStore.java

```diff
@@ -506,7 +506,8 @@ public class MVStore implements AutoCloseable {
         if (id >= 0) {
             map = openMap(id, builder);
             assert builder.getKeyType() == null
                     || map.getKeyType().getClass().equals(builder.getKeyType().getClass());
-            assert builder.getValueType() == null
-                    || map.getValueType().getClass().equals(builder.getValueType().getClass());
+            assert builder.getValueType() == null || map.getValueType().getClass()
+                    .equals(builder.getValueType().getClass());
         } else {
             HashMap<String, Object> c = new HashMap<>();
             id = lastMapId.incrementAndGet();
```
h2/src/main/org/h2/value/Value.java

```diff
@@ -435,13 +435,13 @@ public abstract class Value extends VersionedValue {
             return 43_000;
         case GEOMETRY:
             return 44_000;
-        case ENUM:
-            return 45_000;
         case ARRAY:
             return 50_000;
+        case ROW:
+            return 50_500;
-        case RESULT_SET:
-            return 51_000;
+        case ENUM:
+        case RESULT_SET:
+            return 52_000;
         default:
             if (JdbcUtils.customDataTypesHandler != null) {
```

```diff
@@ -1337,18 +1337,19 @@ public abstract class Value extends VersionedValue {
         return ValueArray.get(a);
     }
 
-    private ValueRow convertToRow() {
+    private Value convertToRow() {
         Value[] a;
-        switch (getValueType()) {
-        case ARRAY:
-            a = ((ValueArray) this).getList();
-            break;
-        case BLOB:
-        case CLOB:
-        case RESULT_SET:
-            a = new Value[] { ValueString.get(getString()) };
-            break;
-        default:
+        if (getValueType() == RESULT_SET) {
+            ResultInterface result = ((ValueResultSet) this).getResult();
+            if (result.hasNext()) {
+                a = result.currentRow();
+                if (result.hasNext()) {
+                    throw DbException.get(ErrorCode.SCALAR_SUBQUERY_CONTAINS_MORE_THAN_ONE_ROW);
+                }
+            } else {
+                return ValueNull.INSTANCE;
+            }
+        } else {
             a = new Value[] { this };
         }
         return ValueRow.get(a);
```
h2/src/test/org/h2/test/scripts/TestScript.java

```diff
@@ -165,7 +165,7 @@ public class TestScript extends TestDb {
             testScript("other/" + s + ".sql");
         }
         for (String s : new String[] { "any", "array-agg", "avg", "bit-and", "bit-or", "count", "envelope",
-                "every", "group-concat", "histogram", "max", "median", "min", "mode", "selectivity", "stddev-pop",
+                "every", "group-concat", "histogram", "max", "min", "mode", "percentile", "selectivity", "stddev-pop",
                 "stddev-samp", "sum", "var-pop", "var-samp" }) {
             testScript("functions/aggregate/" + s + ".sql");
         }
```
h2/src/test/org/h2/test/scripts/datatypes/array.sql

```diff
@@ -84,5 +84,13 @@ SELECT ARRAY[1, NULL] IN (SELECT A FROM TEST);
 SELECT ROW(ARRAY[1, NULL]) IN (SELECT A FROM TEST);
 >> null
 
+-- Compatibility with H2 1.4.197 and older
+SELECT A FROM TEST WHERE A = (1, 2);
+>> [1, 2]
+
+-- Compatibility with H2 1.4.197 and older
+INSERT INTO TEST VALUES ((1, 3));
+> update count: 1
+
 DROP TABLE TEST;
 > ok
```
h2/src/test/org/h2/test/scripts/functions/aggregate/mode.sql

```diff
@@ -21,10 +21,16 @@ SELECT MODE(V) FROM TEST;
 INSERT INTO TEST VALUES (1), (2), (3), (1), (2), (1);
 > update count: 6
 
-SELECT MODE(V), MODE(V) FILTER (WHERE (V > 1)), MODE(V) FILTER (WHERE (V < 0)) FROM TEST;
-> MODE(V) MODE(V) FILTER (WHERE (V > 1)) MODE(V) FILTER (WHERE (V < 0))
-> ------- ------------------------------ ------------------------------
-> 1       2                              null
+SELECT MODE(V), MODE() WITHIN GROUP (ORDER BY V DESC) FROM TEST;
+> MODE() WITHIN GROUP (ORDER BY V) MODE() WITHIN GROUP (ORDER BY V DESC)
+> -------------------------------- -------------------------------------
+> 1                                1
 > rows: 1
+
+SELECT MODE(V) FILTER (WHERE (V > 1)), MODE(V) FILTER (WHERE (V < 0)) FROM TEST;
+> MODE() WITHIN GROUP (ORDER BY V) FILTER (WHERE (V > 1)) MODE() WITHIN GROUP (ORDER BY V) FILTER (WHERE (V < 0))
+> ------------------------------------------------------- -------------------------------------------------------
+> 2                                                       null
+> rows: 1
```

```diff
@@ -46,15 +52,35 @@ SELECT MODE(V ORDER BY V DESC) FROM TEST;
 SELECT MODE(V ORDER BY V + 1) FROM TEST;
 > exception IDENTICAL_EXPRESSIONS_SHOULD_BE_USED
 
 SELECT MODE() WITHIN GROUP (ORDER BY V) FROM TEST;
 >> 1
 
 SELECT MODE() WITHIN GROUP (ORDER BY V ASC) FROM TEST;
 >> 1
 
 SELECT MODE() WITHIN GROUP (ORDER BY V DESC) FROM TEST;
 >> 3
 
+SELECT
+    MODE() WITHIN GROUP (ORDER BY V) OVER () MA,
+    MODE() WITHIN GROUP (ORDER BY V DESC) OVER () MD,
+    MODE() WITHIN GROUP (ORDER BY V) OVER (ORDER BY V) MWA,
+    MODE() WITHIN GROUP (ORDER BY V DESC) OVER (ORDER BY V) MWD,
+    V FROM TEST;
+> MA MD MWA  MWD  V
+> -- -- ---- ---- ----
+> 1  3  1    1    1
+> 1  3  1    1    1
+> 1  3  1    1    1
+> 1  3  1    2    2
+> 1  3  1    2    2
+> 1  3  1    2    2
+> 1  3  1    3    3
+> 1  3  1    3    3
+> 1  3  1    3    3
+> 1  3  null null null
+> rows: 10
+
 DROP TABLE TEST;
 > ok
```
h2/src/test/org/h2/test/scripts/functions/aggregate/median.sql → h2/src/test/org/h2/test/scripts/functions/aggregate/percentile.sql

```diff
@@ -13,14 +13,26 @@ create index test_idx on test(v asc);
 insert into test values (20), (20), (10);
 > update count: 3
 
-select median(v) from test;
->> 20
+select
+    percentile_disc(0.5) within group (order by v) d50a,
+    percentile_disc(0.5) within group (order by v desc) d50d,
+    median(v) m from test;
+> D50A D50D M
+> ---- ---- --
+> 20   20   20
+> rows: 1
 
 insert into test values (null);
 > update count: 1
 
-select median(v) from test;
->> 20
+select
+    percentile_disc(0.5) within group (order by v) d50a,
+    percentile_disc(0.5) within group (order by v desc) d50d,
+    median(v) m from test;
+> D50A D50D M
+> ---- ---- --
+> 20   20   20
+> rows: 1
 
 select median(distinct v) from test;
 >> 15
@@ -28,8 +40,14 @@ select median(distinct v) from test;
 insert into test values (10);
 > update count: 1
 
-select median(v) from test;
->> 15
+select
+    percentile_disc(0.5) within group (order by v) d50a,
+    percentile_disc(0.5) within group (order by v desc) d50d,
+    median(v) m from test;
+> D50A D50D M
+> ---- ---- --
+> 10   20   15
+> rows: 1
 
 drop table test;
 > ok
@@ -44,14 +62,26 @@ create index test_idx on test(v asc nulls first);
 insert into test values (20), (20), (10);
 > update count: 3
 
-select median(v) from test;
->> 20
+select
+    percentile_disc(0.5) within group (order by v) d50a,
+    percentile_disc(0.5) within group (order by v desc) d50d,
+    median(v) m from test;
+> D50A D50D M
+> ---- ---- --
+> 20   20   20
+> rows: 1
 
 insert into test values (null);
 > update count: 1
 
-select median(v) from test;
->> 20
+select
+    percentile_disc(0.5) within group (order by v) d50a,
+    percentile_disc(0.5) within group (order by v desc) d50d,
+    median(v) m from test;
+> D50A D50D M
+> ---- ---- --
+> 20   20   20
+> rows: 1
 
 select median(distinct v) from test;
 >> 15
@@ -59,8 +89,14 @@ select median(distinct v) from test;
 insert into test values (10);
 > update count: 1
 
-select median(v) from test;
->> 15
+select
+    percentile_disc(0.5) within group (order by v) d50a,
+    percentile_disc(0.5) within group (order by v desc) d50d,
+    median(v) m from test;
+> D50A D50D M
+> ---- ---- --
+> 10   20   15
+> rows: 1
 
 drop table test;
 > ok
@@ -75,14 +111,26 @@ create index test_idx on test(v asc nulls last);
 insert into test values (20), (20), (10);
 > update count: 3
 
-select median(v) from test;
->> 20
+select
+    percentile_disc(0.5) within group (order by v) d50a,
+    percentile_disc(0.5) within group (order by v desc) d50d,
+    median(v) m from test;
+> D50A D50D M
+> ---- ---- --
+> 20   20   20
+> rows: 1
 
 insert into test values (null);
 > update count: 1
 
-select median(v) from test;
->> 20
+select
+    percentile_disc(0.5) within group (order by v) d50a,
+    percentile_disc(0.5) within group (order by v desc) d50d,
+    median(v) m from test;
+> D50A D50D M
+> ---- ---- --
+> 20   20   20
+> rows: 1
 
 select median(distinct v) from test;
 >> 15
@@ -90,8 +138,14 @@ select median(distinct v) from test;
 insert into test values (10);
 > update count: 1
 
-select median(v) from test;
->> 15
+select
+    percentile_disc(0.5) within group (order by v) d50a,
+    percentile_disc(0.5) within group (order by v desc) d50d,
+    median(v) m from test;
+> D50A D50D M
+> ---- ---- --
+> 10   20   15
+> rows: 1
 
 drop table test;
 > ok
@@ -106,14 +160,26 @@ create index test_idx on test(v desc);
 insert into test values (20), (20), (10);
 > update count: 3
 
-select median(v) from test;
->> 20
+select
+    percentile_disc(0.5) within group (order by v) d50a,
+    percentile_disc(0.5) within group (order by v desc) d50d,
+    median(v) m from test;
+> D50A D50D M
+> ---- ---- --
+> 20   20   20
+> rows: 1
 
 insert into test values (null);
 > update count: 1
 
-select median(v) from test;
->> 20
+select
+    percentile_disc(0.5) within group (order by v) d50a,
+    percentile_disc(0.5) within group (order by v desc) d50d,
+    median(v) m from test;
+> D50A D50D M
+> ---- ---- --
+> 20   20   20
+> rows: 1
 
 select median(distinct v) from test;
 >> 15
@@ -121,8 +187,14 @@ select median(distinct v) from test;
 insert into test values (10);
 > update count: 1
 
-select median(v) from test;
->> 15
+select
+    percentile_disc(0.5) within group (order by v) d50a,
+    percentile_disc(0.5) within group (order by v desc) d50d,
+    median(v) m from test;
+> D50A D50D M
+> ---- ---- --
+> 10   20   15
+> rows: 1
 
 drop table test;
 > ok
@@ -137,14 +209,26 @@ create index test_idx on test(v desc nulls first);
 insert into test values (20), (20), (10);
 > update count: 3
 
-select median(v) from test;
->> 20
+select
+    percentile_disc(0.5) within group (order by v) d50a,
+    percentile_disc(0.5) within group (order by v desc) d50d,
+    median(v) m from test;
+> D50A D50D M
+> ---- ---- --
+> 20   20   20
+> rows: 1
 
 insert into test values (null);
 > update count: 1
 
-select median(v) from test;
->> 20
+select
+    percentile_disc(0.5) within group (order by v) d50a,
+    percentile_disc(0.5) within group (order by v desc) d50d,
+    median(v) m from test;
+> D50A D50D M
+> ---- ---- --
+> 20   20   20
+> rows: 1
 
 select median(distinct v) from test;
 >> 15
@@ -152,8 +236,14 @@ select median(distinct v) from test;
 insert into test values (10);
 > update count: 1
 
-select median(v) from test;
->> 15
+select
+    percentile_disc(0.5) within group (order by v) d50a,
+    percentile_disc(0.5) within group (order by v desc) d50d,
+    median(v) m from test;
+> D50A D50D M
+> ---- ---- --
+> 10   20   15
+> rows: 1
 
 drop table test;
 > ok
@@ -168,14 +258,26 @@ create index test_idx on test(v desc nulls last);
 insert into test values (20), (20), (10);
 > update count: 3
 
-select median(v) from test;
->> 20
+select
+    percentile_disc(0.5) within group (order by v) d50a,
+    percentile_disc(0.5) within group (order by v desc) d50d,
+    median(v) m from test;
+> D50A D50D M
+> ---- ---- --
+> 20   20   20
+> rows: 1
 
 insert into test values (null);
 > update count: 1
 
-select median(v) from test;
->> 20
+select
+    percentile_disc(0.5) within group (order by v) d50a,
+    percentile_disc(0.5) within group (order by v desc) d50d,
+    median(v) m from test;
+> D50A D50D M
+> ---- ---- --
+> 20   20   20
+> rows: 1
 
 select median(distinct v) from test;
 >> 15
@@ -183,8 +285,14 @@ select median(distinct v) from test;
 insert into test values (10);
 > update count: 1
 
-select median(v) from test;
->> 15
+select
+    percentile_disc(0.5) within group (order by v) d50a,
+    percentile_disc(0.5) within group (order by v desc) d50d,
+    median(v) m from test;
+> D50A D50D M
+> ---- ---- --
+> 10   20   15
+> rows: 1
 
 drop table test;
 > ok
@@ -465,6 +573,12 @@ insert into test values ('2000-01-20 20:00:00'), ('2000-01-21 20:00:00');
 select median(v) from test;
 >> 2000-01-21 08:00:00
 
+insert into test values ('-2000-01-10 10:00:00'), ('-2000-01-10 10:00:01');
+> update count: 2
+
+select percentile_cont(0.16) within group (order by v) from test;
+>> -2000-01-10 10:00:00.48
+
 drop table test;
 > ok
@@ -501,6 +615,15 @@ insert into test values ('2000-01-20 20:00:00+10:15'), ('2000-01-21 20:00:00-09'
 select median(v) from test;
 >> 2000-01-21 08:00:30+00:37
 
+delete from test;
+> update count: 2
+
+insert into test values ('-2000-01-20 20:00:00+10:15'), ('-2000-01-21 20:00:00-09');
+> update count: 2
+
+select median(v) from test;
+>> -2000-01-21 08:00:30+00:37
+
 drop table test;
 > ok
@@ -665,3 +788,113 @@ select dept, median(amount) filter (where amount >= 20) from test
 drop table test;
 > ok
+
+create table test(g int, v int);
+> ok
+
+insert into test values (1, 1), (1, 2), (1, 3), (1, 4), (1, 5), (1, 6), (1, 7), (1, 8),
+    (1, 9), (1, 10), (2, 10), (2, 20), (2, 30), (2, 100);
+> update count: 14
+
+select
+    percentile_cont(0.05) within group (order by v) c05a,
+    percentile_cont(0.05) within group (order by v desc) c05d,
+    percentile_cont(0.5) within group (order by v) c50,
+    percentile_cont(0.5) within group (order by v desc) c50d,
+    percentile_cont(0.95) within group (order by v) c95a,
+    percentile_cont(0.95) within group (order by v desc) c95d,
+    g from test group by g;
+> C05A C05D C50 C50D C95A C95D G
+> ---- ---- --- ---- ---- ---- -
+> 1    9    5   5    9    1    1
+> 11   89   25  25   89   11   2
+> rows: 2
+
+select
+    percentile_disc(0.05) within group (order by v) d05a,
+    percentile_disc(0.05) within group (order by v desc) d05d,
+    percentile_disc(0.5) within group (order by v) d50,
+    percentile_disc(0.5) within group (order by v desc) d50d,
+    percentile_disc(0.95) within group (order by v) d95a,
+    percentile_disc(0.95) within group (order by v desc) d95d,
+    g from test group by g;
+> D05A D05D D50 D50D D95A D95D G
+> ---- ---- --- ---- ---- ---- -
+> 1    10   5   6    10   1    1
+> 10   100  20  30   100  10   2
+> rows: 2
+
+select
+    percentile_disc(0.05) within group (order by v) over (partition by g order by v) d05a,
+    percentile_disc(0.05) within group (order by v desc) over (partition by g order by v) d05d,
+    percentile_disc(0.5) within group (order by v) over (partition by g order by v) d50,
+    percentile_disc(0.5) within group (order by v desc) over (partition by g order by v) d50d,
+    percentile_disc(0.95) within group (order by v) over (partition by g order by v) d95a,
+    percentile_disc(0.95) within group (order by v desc) over (partition by g order by v) d95d,
+    g, v from test order by g, v;
+> D05A D05D D50 D50D D95A D95D G V
+> ---- ---- --- ---- ---- ---- - ---
+> 1    1    1   1    1    1    1 1
+> 1    2    1   2    2    1    1 2
+> 1    3    2   2    3    1    1 3
+> 1    4    2   3    4    1    1 4
+> 1    5    3   3    5    1    1 5
+> 1    6    3   4    6    1    1 6
+> 1    7    4   4    7    1    1 7
+> 1    8    4   5    8    1    1 8
+> 1    9    5   5    9    1    1 9
+> 1    10   5   6    10   1    1 10
+> 10   10   10  10   10   10   2 10
+> 10   20   10  20   20   10   2 20
+> 10   30   20  20   30   10   2 30
+> 10   100  20  30   100  10   2 100
+> rows (ordered): 14
+
+delete from test where g <> 1;
+> update count: 4
+
+create index test_idx on test(v);
+> ok
+
+select
+    percentile_disc(0.05) within group (order by v) d05a,
+    percentile_disc(0.05) within group (order by v desc) d05d,
+    percentile_disc(0.5) within group (order by v) d50,
+    percentile_disc(0.5) within group (order by v desc) d50d,
+    percentile_disc(0.95) within group (order by v) d95a,
+    percentile_disc(0.95) within group (order by v desc) d95d
+    from test;
+> D05A D05D D50 D50D D95A D95D
+> ---- ---- --- ---- ---- ----
+> 1    10   5   6    10   1
+> rows: 1
+
+SELECT percentile_disc(null) within group (order by v) from test;
+>> null
+
+SELECT percentile_disc(-0.01) within group (order by v) from test;
+> exception INVALID_VALUE_2
+
+SELECT percentile_disc(1.01) within group (order by v) from test;
+> exception INVALID_VALUE_2
+
+SELECT percentile_disc(v) within group (order by v) from test;
+> exception INVALID_VALUE_2
+
+drop index test_idx;
+> ok
+
+SELECT percentile_disc(null) within group (order by v) from test;
+>> null
+
+SELECT percentile_disc(-0.01) within group (order by v) from test;
+> exception INVALID_VALUE_2
+
+SELECT percentile_disc(1.01) within group (order by v) from test;
+> exception INVALID_VALUE_2
+
+SELECT percentile_disc(v) within group (order by v) from test;
+> exception INVALID_VALUE_2
+
+drop table test;
+> ok
```
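The PERCENTILE_DISC expectations in the script above are consistent with a simple rule: in the specified ordering, pick the first value whose cumulative fraction reaches the argument, i.e. the `ceil(factor * n)`-th element (1-based, with a floor of 1). A sketch of that rule, checked against the grouped results for group 1 (this is an inference from the expected output, not necessarily H2's internal code path):

```java
import java.util.Arrays;

public class PercentileDiscDemo {
    // PERCENTILE_DISC over an already-sorted array: return the
    // ceil(factor * n)-th element (1-based), clamped to at least 1.
    static int percentileDisc(int[] sorted, double factor) {
        int n = sorted.length;
        int index = (int) Math.ceil(factor * n);
        if (index < 1) {
            index = 1;
        }
        return sorted[index - 1];
    }

    public static void main(String[] args) {
        int[] v = { 1, 2, 3, 4, 5, 6, 7, 8, 9, 10 };
        // Matches the D05A, D50 and D95A columns for group 1 above
        System.out.println(percentileDisc(v, 0.05)); // 1
        System.out.println(percentileDisc(v, 0.5));  // 5
        System.out.println(percentileDisc(v, 0.95)); // 10
        // The "desc" variants apply the same rule to the reversed order,
        // which is why D50D is 6 rather than 5 for group 1.
        int[] d = v.clone();
        for (int i = 0; i < d.length / 2; i++) {
            int t = d[i]; d[i] = d[d.length - 1 - i]; d[d.length - 1 - i] = t;
        }
        System.out.println(percentileDisc(d, 0.5)); // 6
    }
}
```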
h2/src/tools/org/h2/build/doc/dictionary.txt

```diff
@@ -806,4 +806,4 @@ econd irst bcef ordinality nord unnest
 analyst occupation distributive josaph aor engineer sajeewa isuru randil kevin doctor businessman artist ashan
 corrupts splitted disruption unintentional octets preconditions predicates subq objectweb insn opcodes
 preserves masking holder unboxing avert iae transformed subtle reevaluate exclusions subclause ftbl rgr
-presorted inclusion contexts aax
+presorted inclusion contexts aax mwd percentile cont interpolate mwa
```