Bug 1642940 - Factor out data/elem segment implementation limits. r=lth

Data and element segment decoding used the initial memory/table limit values,
which will be removed in a later commit. This commit gives data/elem segments
their own implementation-defined limits to prevent interference.

Differential Revision: https://phabricator.services.mozilla.com/D80137
Ryan Hunt 2020-06-25 05:21:36 +00:00
Parent e88b049c20
Commit 55957b9c23
2 changed files with 4 additions and 2 deletions
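
The change is a rename at two validation sites, but the pattern it preserves is worth spelling out: the decoder reads the size a module declares for a data or element segment and rejects it against a fixed implementation limit before anything is allocated. Below is a minimal standalone sketch of that pattern, not SpiderMonkey's actual Decoder API; the helper names are hypothetical, the constants mirror the values added in this commit, and PageSize is assumed to be the standard 64 KiB wasm page.

// Hypothetical sketch of the segment-size checks; constants mirror the patch below.
#include <cstdint>
#include <cstdio>

static const uint32_t MaxElemSegmentLength = 10000000;
static const uint32_t MaxDataSegmentLengthPages = 16384;
static const uint32_t PageSize = 65536;  // assumed 64 KiB wasm page

// Reject an element segment whose declared element count exceeds the
// implementation-defined cap, before any table space is reserved.
static bool checkElemSegmentSize(uint32_t numElems) {
  if (numElems > MaxElemSegmentLength) {
    std::fprintf(stderr, "too many table elements\n");
    return false;
  }
  return true;
}

// Reject a data segment whose declared byte length exceeds the
// implementation-defined cap (expressed in pages).
static bool checkDataSegmentSize(uint64_t lengthBytes) {
  if (lengthBytes > uint64_t(MaxDataSegmentLengthPages) * PageSize) {
    std::fprintf(stderr, "segment size too big\n");
    return false;
  }
  return true;
}

int main() {
  // Exactly 1 GiB of data is allowed; one byte more is rejected.
  std::printf("%d\n", checkDataSegmentSize(uint64_t(1) << 30));        // prints 1
  std::printf("%d\n", checkDataSegmentSize((uint64_t(1) << 30) + 1));  // prints 0
  std::printf("%d\n", checkElemSegmentSize(10000001));                 // prints 0
  return 0;
}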


@@ -849,7 +849,9 @@ static const unsigned MaxImports = 100000;
 static const unsigned MaxExports = 100000;
 static const unsigned MaxGlobals = 1000000;
 static const unsigned MaxDataSegments = 100000;
+static const unsigned MaxDataSegmentLengthPages = 16384;
 static const unsigned MaxElemSegments = 10000000;
+static const unsigned MaxElemSegmentLength = 10000000;
 static const unsigned MaxTableLength = 10000000;
 static const unsigned MaxLocals = 50000;
 static const unsigned MaxParams = 1000;
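
For scale: assuming PageSize is the standard 64 KiB wasm page, the new pages-based constant caps a data segment at 1 GiB of bytes, which is what the DecodeDataSection check below multiplies out to. A two-line check of the arithmetic:

#include <cstdint>

// Assumes the standard 64 KiB (65536-byte) wasm page size.
constexpr uint64_t PageSize = 65536;
constexpr uint64_t MaxDataSegmentLengthPages = 16384;

// 16384 pages * 64 KiB/page = 2^14 * 2^16 bytes = 2^30 bytes = 1 GiB.
static_assert(MaxDataSegmentLengthPages * PageSize == (uint64_t(1) << 30),
              "data segment byte limit works out to 1 GiB");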


@@ -2735,7 +2735,7 @@ static bool DecodeElemSection(Decoder& d, ModuleEnvironment* env) {
       return d.fail("expected segment size");
     }
-    if (numElems > MaxTableInitialLength) {
+    if (numElems > MaxElemSegmentLength) {
       return d.fail("too many table elements");
     }
@@ -3052,7 +3052,7 @@ static bool DecodeDataSection(Decoder& d, ModuleEnvironment* env) {
       return d.fail("expected segment size");
     }
-    if (seg.length > MaxMemoryInitialPages * PageSize) {
+    if (seg.length > MaxDataSegmentLengthPages * PageSize) {
       return d.fail("segment size too big");
     }
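
Net effect: segment validation no longer reads the memory/table initial-size constants at all. The new limits appear to have been chosen to match the bounds previously in force (1 GiB of data bytes per segment, assuming a 64 KiB page, and 10,000,000 elements per element segment, matching MaxTableLength), so decoding behavior should be unchanged while the initial-size constants are left free to be removed in the follow-up commit the message refers to.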