mirror of
https://github.com/dart-lang/sdk
synced 2024-10-06 15:19:41 +00:00
[CFE et al] Optimize presubmit scripts
This CL optimizes how the CFE et al presubmits are run. In the examples below we'll see that it takes the presubmit time from 31+ to ~13 seconds, from 31+ to ~20 seconds, and from 30+ to ~19 seconds in a few simple cases, and from 76+ to ~27 seconds in a case where files in _fe_analyzer_shared, front_end, frontend_server and kernel are all changed.

Before this CL, if there were changes in both front_end and frontend_server, for instance, it would run one smoke test for each. Each would technically only test things in its own directory, but they would do a lot of overlapping work: e.g. compiling frontend_server also compiles front_end, the startup cost of a script is paid several times, etc.

The bulk of the change in this CL is thus to only run things once. Now, if there is a change in both front_end and frontend_server, the python presubmit will still launch a script for each, but it's just a light-weight script that takes ~400 ms to run (on my machine) if it decides to not do anything. It looks at the changed files, from that it knows which presubmits will be run, and it decides which one of them will actually do the work - the rest just exit and say "it will be tested by this other one".

Furthermore it then tries to run only the smoke tests that are necessary. For instance, if you have only changed a test in front_end it will only run the spell checker (and only for that file).

Note that this is not perfect and there can be cases where you should get a presubmit error but won't. For instance, if you remove all content from the spell-checking dictionary file it should give you lots of spelling mistake errors, but it won't, because it won't actually run the spell checker (as no files it should spell check were changed). You probably have to actively try to cheat it though, so I don't see it as a big problem. Things will still be checked fully on the CI.
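The "only run once" idea can be sketched roughly as follows. This is an illustrative sketch only, with invented function and variable names (the actual logic lives in pkg/front_end/presubmit_helper.dart): each per-package presubmit computes which packages own changed files, and only the alphabetically first affected package does the work.

```python
def affected_packages(changed_files, known_packages):
    """Sorted list of packages that own at least one changed file."""
    hit = set()
    for path in changed_files:
        for pkg in known_packages:
            if path.startswith("pkg/" + pkg + "/"):
                hit.add(pkg)
    return sorted(hit)

def should_do_work(caller_pkg, changed_files, known_packages):
    """True only for the first affected package, so the work happens once."""
    hit = affected_packages(changed_files, known_packages)
    return bool(hit) and hit[0] == caller_pkg

packages = ["_fe_analyzer_shared", "front_end", "frontend_server", "kernel"]
changed = ["pkg/front_end/lib/a.dart", "pkg/frontend_server/lib/b.dart"]
print(should_do_work("front_end", changed, packages))        # True
print(should_do_work("frontend_server", changed, packages))  # False
```

Each presubmit still gets launched, but all except one return almost immediately.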
Additionally:

* The generated messages will have trailing commas, which speeds up
  formatting of the generated files (in the cases where the generated
  files have to be checked).
* The explicit creation testing tool will do the outline of everything,
  but only do the bodies of the changed files.
* Building the "ast model" only compiles the outline.

Left to do:

* If only changing a single test, for instance, it will only run the
  spell checker on that file, but launching the isolate it's run in still
  takes ~7 seconds because it loads up other stuff too. Maybe we could
  have special entry points for cases where it only has to run an
  otherwise simple test.
* The presubmit in the sdk dir (not CFE related) doesn't do well with
  many (big) changed files, and testing them for formatting errors can
  easily take 10+ seconds (see the example below where it contributes ~5
  seconds, for instance). Maybe `dart format` could be made faster, or
  maybe the script should test more than one file at once.

*Example runs before and after*:

Change in a single test file in front_end
=========================================

Now:
```
$ time git cl presubmit -v -f
[I2024-01-25 09:46:08,391 187077 140400494405504 presubmit_support.py] Found 1 file(s).
Running Python 3 presubmit commit checks ...
Running [...]/sdk/PRESUBMIT.py
Running [...]/sdk/pkg/front_end/PRESUBMIT.py
Presubmit checks took 11.5s to calculate.
Python 3 presubmit checks passed.

real    0m12.772s
user    0m16.093s
sys     0m2.146s
```

Before:
```
$ time git cl presubmit -v -f
[I2024-01-25 10:07:08,519 200015 140338735470464 presubmit_support.py] Found 1 file(s).
Running Python 3 presubmit commit checks ...
Running [...]/sdk/PRESUBMIT.py
Running [...]/sdk/pkg/front_end/PRESUBMIT.py
28.3s to run CheckChangeOnCommit from [...]/sdk/pkg/front_end/PRESUBMIT.py.
Presubmit checks took 30.0s to calculate.
Python 3 presubmit checks passed.

real    0m31.396s
user    2m9.500s
sys     0m11.559s
```

So from 31+ to ~13 seconds.
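The last "Left to do" item above suggests checking more than one file per `dart format` invocation. A rough sketch of that batching idea (the helper name and batch size are invented for illustration; `--set-exit-if-changed` and `--output=none` are real `dart format` flags):

```python
import subprocess

def format_check_in_batches(files, batch_size=50, dry_run=True):
    """Build (and optionally run) one `dart format` command per batch of
    files, instead of spawning one process per file."""
    commands = []
    for i in range(0, len(files), batch_size):
        batch = files[i:i + batch_size]
        cmd = ["dart", "format", "--set-exit-if-changed", "--output=none"] + batch
        commands.append(cmd)
        if not dry_run:
            # Non-zero exit if any file in the batch is unformatted.
            subprocess.run(cmd, check=True)
    return commands

# Dry run: 120 files become 3 process launches instead of 120.
cmds = format_check_in_batches(["f%d.dart" % i for i in range(120)])
print(len(cmds))  # 3
```

Whether this actually helps depends on how much of the per-file cost is process startup versus formatting itself.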
---------------------------------------------------------------------

Change in a single test file and a single lib file in front_end
===============================================================

Now:
```
$ time git cl presubmit -v -f
Running Python 3 presubmit commit checks ...
Running [...]/sdk/PRESUBMIT.py
Running [...]/sdk/pkg/front_end/PRESUBMIT.py
15.9s to run CheckChangeOnCommit from [...]/sdk/pkg/front_end/PRESUBMIT.py.
Presubmit checks took 18.0s to calculate.
Python 3 presubmit checks passed.

real    0m19.365s
user    0m33.157s
sys     0m5.049s
```

Before:
```
$ time git cl presubmit -v -f
[I2024-01-25 10:08:36,277 200953 140133274818432 presubmit_support.py] Found 2 file(s).
Running Python 3 presubmit commit checks ...
Running [...]/sdk/PRESUBMIT.py
Running [...]/sdk/pkg/front_end/PRESUBMIT.py
27.9s to run CheckChangeOnCommit from [...]/sdk/pkg/front_end/PRESUBMIT.py.
Presubmit checks took 30.0s to calculate.
Python 3 presubmit checks passed.

real    0m31.311s
user    2m9.854s
sys     0m11.898s
```

So from 31+ to ~20 seconds.

---------------------------------------------------------------------

Change only the messages file in front_end (but with generated files not changing)
==================================================================================

Now:
```
$ time git cl presubmit -v -f
[I2024-01-25 09:53:02,823 190466 140548397250432 presubmit_support.py] Found 1 file(s).
Running Python 3 presubmit commit checks ...
Running [...]/sdk/PRESUBMIT.py
Running [...]/sdk/pkg/front_end/PRESUBMIT.py
15.6s to run CheckChangeOnCommit from [...]/sdk/pkg/front_end/PRESUBMIT.py.
Presubmit checks took 17.0s to calculate.
Python 3 presubmit checks passed.

real    0m18.326s
user    0m38.999s
sys     0m4.530s
```

Before:
```
$ time git cl presubmit -v -f
[I2024-01-25 10:10:04,431 201892 140717686302592 presubmit_support.py] Found 1 file(s).
Running Python 3 presubmit commit checks ...
Running [...]/sdk/PRESUBMIT.py
Running [...]/sdk/pkg/front_end/PRESUBMIT.py
28.0s to run CheckChangeOnCommit from [...]/sdk/pkg/front_end/PRESUBMIT.py.
Presubmit checks took 29.2s to calculate.
Python 3 presubmit checks passed.

real    0m30.550s
user    2m9.488s
sys     0m11.689s
```

So from 30+ to ~19 seconds.

---------------------------------------------------------------------

Change several files
====================

```
$ git diff --stat
 pkg/_fe_analyzer_shared/lib/src/messages/codes_generated.dart         | 4 ++--
 pkg/_fe_analyzer_shared/lib/src/parser/listener.dart                  | 2 ++
 pkg/front_end/lib/src/api_prototype/incremental_kernel_generator.dart | 2 ++
 pkg/front_end/lib/src/base/processed_options.dart                     | 2 ++
 pkg/front_end/messages.yaml                                           | 2 +-
 pkg/front_end/tool/dart_doctest_impl.dart                             | 2 ++
 pkg/frontend_server/lib/compute_kernel.dart                           | 2 ++
 pkg/kernel/lib/ast.dart                                               | 2 ++
 8 files changed, 15 insertions(+), 3 deletions(-)
```

Now:
```
[I2024-01-25 09:57:53,270 193911 140320429016960 presubmit_support.py] Found 8 file(s).
Running Python 3 presubmit commit checks ...
Running [...]/sdk/PRESUBMIT.py
Running [...]/sdk/pkg/_fe_analyzer_shared/PRESUBMIT.py
17.8s to run CheckChangeOnCommit from [...]/sdk/pkg/_fe_analyzer_shared/PRESUBMIT.py.
Running [...]/sdk/pkg/front_end/PRESUBMIT.py
Running [...]/sdk/pkg/frontend_server/PRESUBMIT.py
Running [...]/sdk/pkg/kernel/PRESUBMIT.py
Presubmit checks took 25.3s to calculate.
Python 3 presubmit checks passed.

real    0m26.585s
user    1m8.997s
sys     0m8.742s
```

Worth noting here is that "sdk/PRESUBMIT.py" takes 5+ seconds.

Before:
```
[I2024-01-25 10:11:39,863 203026 140202046494592 presubmit_support.py] Found 8 file(s).
Running Python 3 presubmit commit checks ...
Running [...]/sdk/PRESUBMIT.py
Running [...]/sdk/pkg/_fe_analyzer_shared/PRESUBMIT.py
14.6s to run CheckChangeOnCommit from [...]/sdk/pkg/_fe_analyzer_shared/PRESUBMIT.py.
Running [...]/sdk/pkg/front_end/PRESUBMIT.py
28.0s to run CheckChangeOnCommit from [...]/sdk/pkg/front_end/PRESUBMIT.py.
Running [...]/sdk/pkg/frontend_server/PRESUBMIT.py
20.9s to run CheckChangeOnCommit from [...]/sdk/pkg/frontend_server/PRESUBMIT.py.
Running [...]/sdk/pkg/kernel/PRESUBMIT.py
Presubmit checks took 75.6s to calculate.
Python 3 presubmit checks passed.

real    1m16.870s
user    3m48.784s
sys     0m23.689s
```

So from 76+ to ~27 seconds.

In response to https://github.com/dart-lang/sdk/issues/54665

Change-Id: I59a43f5009bba8c2fdcb5d3a843b4cb408499214
Reviewed-on: https://dart-review.googlesource.com/c/sdk/+/348301
Commit-Queue: Jens Johansen <jensj@google.com>
Reviewed-by: Johnni Winther <johnniwinther@google.com>
This commit is contained in:
parent
70e4ff3e1a
commit
f0ca213d60
```diff
@@ -1,8 +1,8 @@
 #!/usr/bin/env python3
-# Copyright (c) 2019, the Dart project authors. Please see the AUTHORS file
+# Copyright (c) 2024, the Dart project authors. Please see the AUTHORS file
 # for details. All rights reserved. Use of this source code is governed by a
 # BSD-style license that can be found in the LICENSE file.
-"""Shared front-end analyzer specific presubmit script.
+"""CFE et al presubmit python script.
 
 See http://dev.chromium.org/developers/how-tos/depottools/presubmit-scripts
 for more details about the presubmit API built into gcl.
```
```diff
@@ -30,45 +30,35 @@ def load_source(modname, filename):
 
 
 def runSmokeTest(input_api, output_api):
-    hasChangedFiles = False
-    for git_file in input_api.AffectedTextFiles():
-        filename = git_file.AbsoluteLocalPath()
-        if filename.endswith(".dart"):
-            hasChangedFiles = True
-            break
-
-    if hasChangedFiles:
-        local_root = input_api.change.RepositoryRoot()
-        utils = load_source('utils',
-                            os.path.join(local_root, 'tools', 'utils.py'))
-        dart = os.path.join(utils.CheckedInSdkPath(), 'bin', 'dart')
-        smoke_test = os.path.join(local_root, 'pkg', '_fe_analyzer_shared',
-                                  'tool', 'smoke_test_quick.dart')
-
-        windows = utils.GuessOS() == 'win32'
-        if windows:
-            dart += '.exe'
-
-        if not os.path.isfile(dart):
-            print('WARNING: dart not found: %s' % dart)
-            return []
-
-        if not os.path.isfile(smoke_test):
-            print('WARNING: _fe_analyzer_shared smoke test not found: %s' %
-                  smoke_test)
-            return []
-
-        args = [dart, smoke_test]
-        process = subprocess.Popen(
-            args, stdout=subprocess.PIPE, stdin=subprocess.PIPE)
-        outs, _ = process.communicate()
-
-        if process.returncode != 0:
-            return [
-                output_api.PresubmitError(
-                    '_fe_analyzer_shared smoke test failure(s):',
-                    long_text=outs)
-            ]
-
+    local_root = input_api.change.RepositoryRoot()
+    utils = load_source('utils', os.path.join(local_root, 'tools', 'utils.py'))
+    dart = os.path.join(utils.CheckedInSdkPath(), 'bin', 'dart')
+    test_helper = os.path.join(local_root, 'pkg', 'front_end',
+                               'presubmit_helper.dart')
+
+    windows = utils.GuessOS() == 'win32'
+    if windows:
+        dart += '.exe'
+
+    if not os.path.isfile(dart):
+        print('WARNING: dart not found: %s' % dart)
+        return []
+
+    if not os.path.isfile(test_helper):
+        print('WARNING: CFE et al presubmit_helper not found: %s' % test_helper)
+        return []
+
+    args = [dart, test_helper, input_api.PresubmitLocalPath()]
+    process = subprocess.Popen(args,
+                               stdout=subprocess.PIPE,
+                               stdin=subprocess.PIPE)
+    outs, _ = process.communicate()
+
+    if process.returncode != 0:
+        return [
+            output_api.PresubmitError('CFE et al presubmit script failure(s):',
+                                      long_text=outs)
+        ]
+
     return []
```
File diff suppressed because it is too large
```diff
@@ -1,8 +1,8 @@
 #!/usr/bin/env python3
-# Copyright (c) 2019, the Dart project authors. Please see the AUTHORS file
+# Copyright (c) 2024, the Dart project authors. Please see the AUTHORS file
 # for details. All rights reserved. Use of this source code is governed by a
 # BSD-style license that can be found in the LICENSE file.
-"""Front-end specific presubmit script.
+"""CFE et al presubmit python script.
 
 See http://dev.chromium.org/developers/how-tos/depottools/presubmit-scripts
 for more details about the presubmit API built into gcl.
```
```diff
@@ -30,42 +30,35 @@ def load_source(modname, filename):
 
 
 def runSmokeTest(input_api, output_api):
-    hasChangedFiles = False
-    for git_file in input_api.AffectedTextFiles():
-        filename = git_file.AbsoluteLocalPath()
-        if filename.endswith(".dart") or filename.endswith("messages.yaml"):
-            hasChangedFiles = True
-            break
-
-    if hasChangedFiles:
-        local_root = input_api.change.RepositoryRoot()
-        utils = load_source('utils',
-                            os.path.join(local_root, 'tools', 'utils.py'))
-        dart = os.path.join(utils.CheckedInSdkPath(), 'bin', 'dart')
-        smoke_test = os.path.join(local_root, 'pkg', 'front_end', 'tool',
-                                  'smoke_test_quick.dart')
-
-        windows = utils.GuessOS() == 'win32'
-        if windows:
-            dart += '.exe'
-
-        if not os.path.isfile(dart):
-            print('WARNING: dart not found: %s' % dart)
-            return []
-
-        if not os.path.isfile(smoke_test):
-            print('WARNING: Front-end smoke test not found: %s' % smoke_test)
-            return []
-
-        args = [dart, smoke_test]
-        process = subprocess.Popen(
-            args, stdout=subprocess.PIPE, stdin=subprocess.PIPE)
-        outs, _ = process.communicate()
-
-        if process.returncode != 0:
-            return [output_api.PresubmitError(
-                'Front-end smoke test failure(s):',
-                long_text=outs)]
-
+    local_root = input_api.change.RepositoryRoot()
+    utils = load_source('utils', os.path.join(local_root, 'tools', 'utils.py'))
+    dart = os.path.join(utils.CheckedInSdkPath(), 'bin', 'dart')
+    test_helper = os.path.join(local_root, 'pkg', 'front_end',
+                               'presubmit_helper.dart')
+
+    windows = utils.GuessOS() == 'win32'
+    if windows:
+        dart += '.exe'
+
+    if not os.path.isfile(dart):
+        print('WARNING: dart not found: %s' % dart)
+        return []
+
+    if not os.path.isfile(test_helper):
+        print('WARNING: CFE et al presubmit_helper not found: %s' % test_helper)
+        return []
+
+    args = [dart, test_helper, input_api.PresubmitLocalPath()]
+    process = subprocess.Popen(args,
+                               stdout=subprocess.PIPE,
+                               stdin=subprocess.PIPE)
+    outs, _ = process.communicate()
+
+    if process.returncode != 0:
+        return [
+            output_api.PresubmitError('CFE et al presubmit script failure(s):',
+                                      long_text=outs)
+        ]
+
     return []
```
```diff
@@ -52,16 +52,22 @@ Future<CompilerResult?> kernelForProgram(Uri source, CompilerOptions options,
 }
 
 Future<CompilerResult?> kernelForProgramInternal(
-    Uri source, CompilerOptions options,
-    {List<Uri> additionalSources = const <Uri>[],
-    bool retainDataForTesting = false,
-    bool requireMain = true}) async {
+  Uri source,
+  CompilerOptions options, {
+  List<Uri> additionalSources = const <Uri>[],
+  bool retainDataForTesting = false,
+  bool requireMain = true,
+  bool buildComponent = true,
+}) async {
   ProcessedOptions pOptions = new ProcessedOptions(
       options: options, inputs: [source, ...additionalSources]);
   return await CompilerContext.runWithOptions(pOptions, (context) async {
     CompilerResult result = await generateKernelInternal(
-        includeHierarchyAndCoreTypes: true,
-        retainDataForTesting: retainDataForTesting);
+      includeHierarchyAndCoreTypes: true,
+      retainDataForTesting: retainDataForTesting,
+      buildComponent: buildComponent,
+    );
 
     Component? component = result.component;
     if (component == null) return null;
```
File diff suppressed because it is too large
```diff
@@ -217,6 +217,8 @@ Future<CompilerResult> _buildInternal(
             showOffsets: options.debugDumpShowOffsets);
       }
       options.ticker.logMs("Generated component");
+    } else {
+      component = summaryComponent;
     }
     // TODO(johnniwinther): Should we reuse the macro executor on subsequent
     // compilations where possible?
```
pkg/front_end/presubmit_helper.dart (new file, 674 lines)

@@ -0,0 +1,674 @@
|
||||||
|
// Copyright (c) 2024, the Dart project authors. Please see the AUTHORS file
|
||||||
|
// for details. All rights reserved. Use of this source code is governed by a
|
||||||
|
// BSD-style license that can be found in the LICENSE file.
|
||||||
|
|
||||||
|
// Warning: This file has to start up fast so we can't import lots of stuff.
|
||||||
|
import 'dart:async';
|
||||||
|
import 'dart:convert';
|
||||||
|
import 'dart:io';
|
||||||
|
import 'dart:isolate';
|
||||||
|
|
||||||
|
import 'test/utils/io_utils.dart';
|
||||||
|
|
||||||
|
Future<void> main(List<String> args) async {
|
||||||
|
Stopwatch stopwatch = new Stopwatch()..start();
|
||||||
|
// Expect something like /full/path/to/sdk/pkg/some_dir/whatever/else
|
||||||
|
if (args.length != 1) throw "Need exactly one argument.";
|
||||||
|
|
||||||
|
final List<String> changedFiles = _getChangedFiles();
|
||||||
|
String callerPath = args[0].replaceAll("\\", "/");
|
||||||
|
if (!_shouldRun(changedFiles, callerPath)) {
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
List<Work> workItems = [];
|
||||||
|
|
||||||
|
// This run is now the only run that will actually run any smoke tests.
|
||||||
|
// First collect all relevant smoke tests.
|
||||||
|
// Note that this is *not* perfect, e.g. it might think there's no reason for
|
||||||
|
// a test because the tested hasn't changed even though the actual test has.
|
||||||
|
// E.g. if you only update the spelling dictionary no spell test will be run
|
||||||
|
// because the files being spell-tested hasn't changed.
|
||||||
|
workItems.addIfNotNull(_createExplicitCreationTestWork(changedFiles));
|
||||||
|
workItems.addIfNotNull(_createMessagesTestWork(changedFiles));
|
||||||
|
workItems.addIfNotNull(_createSpellingTestNotSourceWork(changedFiles));
|
||||||
|
workItems.addIfNotNull(_createSpellingTestSourceWork(changedFiles));
|
||||||
|
workItems.addIfNotNull(_createLintWork(changedFiles));
|
||||||
|
workItems.addIfNotNull(_createDepsTestWork(changedFiles));
|
||||||
|
bool shouldRunGenerateFilesTest = _shouldRunGenerateFilesTest(changedFiles);
|
||||||
|
|
||||||
|
// Then run them if we have any.
|
||||||
|
if (workItems.isEmpty && !shouldRunGenerateFilesTest) {
|
||||||
|
print("Nothing to do.");
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
List<Future> futures = [];
|
||||||
|
if (shouldRunGenerateFilesTest) {
|
||||||
|
print("Running generated_files_up_to_date_git_test in different process.");
|
||||||
|
futures.add(_run(
|
||||||
|
"pkg/front_end/test/generated_files_up_to_date_git_test.dart",
|
||||||
|
const []));
|
||||||
|
}
|
||||||
|
|
||||||
|
if (workItems.isNotEmpty) {
|
||||||
|
print("Will now run ${workItems.length} tests.");
|
||||||
|
futures.add(_executePendingWorkItems(workItems));
|
||||||
|
}
|
||||||
|
|
||||||
|
await Future.wait(futures);
|
||||||
|
print("All done in ${stopwatch.elapsed}");
|
||||||
|
}
|
||||||
|
|
||||||
|
/// Map from a dir name in "pkg" to the inner-dir we want to include in the
|
||||||
|
/// explicit creation test.
|
||||||
|
const Map<String, String> _explicitCreationDirs = {
|
||||||
|
"frontend_server": "",
|
||||||
|
"front_end": "lib/",
|
||||||
|
"_fe_analyzer_shared": "lib/",
|
||||||
|
};
|
||||||
|
|
||||||
|
/// This is currently a representative list of the dependencies, but do update
|
||||||
|
/// if it turns out to be needed.
|
||||||
|
const Set<String> _generatedFilesUpToDateFiles = {
|
||||||
|
"pkg/_fe_analyzer_shared/lib/src/experiments/flags.dart",
|
||||||
|
"pkg/_fe_analyzer_shared/lib/src/messages/codes_generated.dart",
|
||||||
|
"pkg/_fe_analyzer_shared/lib/src/parser/listener.dart",
|
||||||
|
"pkg/_fe_analyzer_shared/lib/src/parser/parser_impl.dart",
|
||||||
|
"pkg/front_end/lib/src/api_prototype/experimental_flags_generated.dart",
|
||||||
|
"pkg/front_end/lib/src/fasta/fasta_codes_cfe_generated.dart",
|
||||||
|
"pkg/front_end/lib/src/fasta/util/parser_ast_helper.dart",
|
||||||
|
"pkg/front_end/messages.yaml",
|
||||||
|
"pkg/front_end/test/generated_files_up_to_date_git_test.dart",
|
||||||
|
"pkg/front_end/test/parser_test_listener_creator.dart",
|
||||||
|
"pkg/front_end/test/parser_test_listener.dart",
|
||||||
|
"pkg/front_end/test/parser_test_parser_creator.dart",
|
||||||
|
"pkg/front_end/test/parser_test_parser.dart",
|
||||||
|
"pkg/front_end/tool/_fasta/generate_messages.dart",
|
||||||
|
"pkg/front_end/tool/_fasta/parser_ast_helper_creator.dart",
|
||||||
|
"pkg/front_end/tool/generate_ast_coverage.dart",
|
||||||
|
"pkg/front_end/tool/generate_ast_equivalence.dart",
|
||||||
|
"pkg/front_end/tool/visitor_generator.dart",
|
||||||
|
"pkg/kernel/lib/ast.dart",
|
||||||
|
"pkg/kernel/lib/default_language_version.dart",
|
||||||
|
"pkg/kernel/lib/src/ast/patterns.dart",
|
||||||
|
"pkg/kernel/lib/src/coverage.dart",
|
||||||
|
"pkg/kernel/lib/src/equivalence.dart",
|
||||||
|
"sdk/lib/libraries.json",
|
||||||
|
"tools/experimental_features.yaml",
|
||||||
|
};
|
||||||
|
|
||||||
|
/// Map from a dir name in "pkg" to the inner-dir we want to include in the
|
||||||
|
/// lint test.
|
||||||
|
const Map<String, String> _lintDirs = {
|
||||||
|
"frontend_server": "",
|
||||||
|
"front_end": "lib/",
|
||||||
|
"kernel": "lib/",
|
||||||
|
"_fe_analyzer_shared": "lib/",
|
||||||
|
};
|
||||||
|
|
||||||
|
/// Map from a dir name in "pkg" to the inner-dirs we want to include in the
|
||||||
|
/// spelling (source) test.
|
||||||
|
const Map<String, List<String>> _spellDirs = {
|
||||||
|
"frontend_server": ["lib/", "bin/"],
|
||||||
|
"kernel": ["lib/", "bin/"],
|
||||||
|
"front_end": ["lib/"],
|
||||||
|
"_fe_analyzer_shared": ["lib/"],
|
||||||
|
};
|
||||||
|
|
||||||
|
/// Set of dirs in "pkg" we care about.
|
||||||
|
const Set<String> _usDirs = {
|
||||||
|
"kernel",
|
||||||
|
"frontend_server",
|
||||||
|
"front_end",
|
||||||
|
"_fe_analyzer_shared",
|
||||||
|
};
|
||||||
|
|
||||||
|
final Uri _repoDir = computeRepoDirUri();
|
||||||
|
|
||||||
|
String get _dartVm => Platform.executable;
|
||||||
|
|
||||||
|
DepsTestWork? _createDepsTestWork(List<String> changedFiles) {
|
||||||
|
bool foundFiles = false;
|
||||||
|
for (String path in changedFiles) {
|
||||||
|
if (!path.endsWith(".dart")) continue;
|
||||||
|
if (path.startsWith("pkg/front_end/lib/")) {
|
||||||
|
foundFiles = true;
|
||||||
|
break;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
if (!foundFiles) return null;
|
||||||
|
|
||||||
|
return new DepsTestWork();
|
||||||
|
}
|
||||||
|
|
||||||
|
ExplicitCreationWork? _createExplicitCreationTestWork(
|
||||||
|
List<String> changedFiles) {
|
||||||
|
Set<Uri> includedDirs = {};
|
||||||
|
for (MapEntry<String, String> entry in _explicitCreationDirs.entries) {
|
||||||
|
includedDirs.add(_repoDir.resolve("pkg/${entry.key}/${entry.value}"));
|
||||||
|
}
|
||||||
|
|
||||||
|
Set<Uri> files = {};
|
||||||
|
for (String path in changedFiles) {
|
||||||
|
if (!path.endsWith(".dart")) continue;
|
||||||
|
bool found = false;
|
||||||
|
for (MapEntry<String, String> usDirEntry in _explicitCreationDirs.entries) {
|
||||||
|
if (path.startsWith("pkg/${usDirEntry.key}/${usDirEntry.value}")) {
|
||||||
|
found = true;
|
||||||
|
break;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
if (!found) continue;
|
||||||
|
files.add(_repoDir.resolve(path));
|
||||||
|
}
|
||||||
|
|
||||||
|
if (files.isEmpty) return null;
|
||||||
|
|
||||||
|
return new ExplicitCreationWork(
|
||||||
|
includedFiles: files,
|
||||||
|
includedDirectoryUris: includedDirs,
|
||||||
|
repoDir: _repoDir);
|
||||||
|
}
|
||||||
|
|
||||||
|
LintWork? _createLintWork(List<String> changedFiles) {
|
||||||
|
List<String> filters = [];
|
||||||
|
pathLoop:
|
||||||
|
for (String path in changedFiles) {
|
||||||
|
if (!path.endsWith(".dart")) continue;
|
||||||
|
for (MapEntry<String, String> entry in _lintDirs.entries) {
|
||||||
|
if (path.startsWith("pkg/${entry.key}/${entry.value}")) {
|
||||||
|
String filter = path.substring("pkg/".length, path.length - 5);
|
||||||
|
filters.add("lint/$filter/...");
|
||||||
|
continue pathLoop;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
if (filters.isEmpty) return null;
|
||||||
|
|
||||||
|
return new LintWork(filters: filters, repoDir: _repoDir);
|
||||||
|
}
|
||||||
|
|
||||||
|
MessagesWork? _createMessagesTestWork(List<String> changedFiles) {
|
||||||
|
// TODO(jensj): Could we detect what ones are changed/added and only test
|
||||||
|
// those?
|
||||||
|
for (String file in changedFiles) {
|
||||||
|
if (file == "pkg/front_end/messages.yaml") {
|
||||||
|
return new MessagesWork(repoDir: _repoDir);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// messages.yaml not changed.
|
||||||
|
return null;
|
||||||
|
}
|
||||||
|
|
||||||
|
SpellNotSourceWork? _createSpellingTestNotSourceWork(
|
||||||
|
List<String> changedFiles) {
|
||||||
|
// TODO(jensj): Not here, but I'll add the note here.
|
||||||
|
// package:testing takes *a long time* listing files because it does
|
||||||
|
// ```
|
||||||
|
// if (suite.exclude.any((RegExp r) => path.contains(r))) continue;
|
||||||
|
// if (suite.pattern.any((RegExp r) => path.contains(r))) {}
|
||||||
|
// ```
|
||||||
|
// for each file it finds. Maybe it should do something more efficient,
|
||||||
|
// and maybe it should even take given filters into account at this point?
|
||||||
|
//
|
||||||
|
// Also it lists all files in the specified "path", so for instance for the
|
||||||
|
// src spell one we have to list all files in "pkg/", then filter it down to
|
||||||
|
// stuff in one of the dirs we care about.
|
||||||
|
List<String> filters = [];
|
||||||
|
for (String path in changedFiles) {
|
||||||
|
if (!path.endsWith(".dart")) continue;
|
||||||
|
if (path.startsWith("pkg/front_end/") &&
|
||||||
|
!path.startsWith("pkg/front_end/lib/")) {
|
||||||
|
// Remove front of path and ".dart".
|
||||||
|
String filter = path.substring("pkg/front_end/".length, path.length - 5);
|
||||||
|
filters.add("spelling_test_not_src/$filter");
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
if (filters.isEmpty) return null;
|
||||||
|
|
||||||
|
return new SpellNotSourceWork(filters: filters, repoDir: _repoDir);
|
||||||
|
}
|
||||||
|
|
||||||
|
SpellSourceWork? _createSpellingTestSourceWork(List<String> changedFiles) {
|
||||||
|
List<String> filters = [];
|
||||||
|
pathLoop:
|
||||||
|
for (String path in changedFiles) {
|
||||||
|
if (!path.endsWith(".dart")) continue;
|
||||||
|
for (MapEntry<String, List<String>> entry in _spellDirs.entries) {
|
||||||
|
for (String subPath in entry.value) {
|
||||||
|
if (path.startsWith("pkg/${entry.key}/$subPath")) {
|
||||||
|
String filter = path.substring("pkg/".length, path.length - 5);
|
||||||
|
filters.add("spelling_test_src/$filter");
|
||||||
|
continue pathLoop;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
if (filters.isEmpty) return null;
|
||||||
|
|
||||||
|
return new SpellSourceWork(filters: filters, repoDir: _repoDir);
|
||||||
|
}
|
||||||
|
|
||||||
|
Future<void> _executePendingWorkItems(List<Work> workItems) async {
|
||||||
|
int currentlyRunning = 0;
|
||||||
|
SpawnHelper spawnHelper = new SpawnHelper();
|
||||||
|
print("Waiting for spawn to start up.");
|
||||||
|
Stopwatch stopwatch = new Stopwatch()..start();
|
||||||
|
await spawnHelper
|
||||||
|
.spawn(_repoDir.resolve("pkg/front_end/presubmit_helper_spawn.dart"),
|
||||||
|
(dynamic ok) {
|
||||||
|
if (ok is! bool) {
|
||||||
|
exitCode = 1;
|
||||||
|
print("Error got message of type ${ok.runtimeType}");
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
currentlyRunning--;
|
||||||
|
if (!ok) {
|
||||||
|
exitCode = 1;
|
||||||
|
}
|
||||||
|
});
|
||||||
|
print("Isolate started in ${stopwatch.elapsed}");
|
||||||
|
|
||||||
|
for (Work workItem in workItems) {
|
||||||
|
print("Executing ${workItem.name}.");
|
||||||
|
currentlyRunning++;
|
||||||
|
spawnHelper.send(json.encode(workItem.toJson()));
|
||||||
|
}
|
||||||
|
|
||||||
|
while (currentlyRunning > 0) {
|
||||||
|
await Future.delayed(const Duration(milliseconds: 42));
|
||||||
|
}
|
||||||
|
spawnHelper.close();
|
}

/// Queries git about changes against upstream, or origin/main if no upstream is
/// set. This is similar (but different), I believe, to what
/// `git cl presubmit` does.
List<String> _getChangedFiles() {
  ProcessResult result = Process.runSync(
      "git",
      [
        "-c",
        "core.quotePath=false",
        "diff",
        "--name-status",
        "--no-renames",
        "@{u}...HEAD"
      ],
      runInShell: true);
  if (result.exitCode != 0) {
    result = Process.runSync(
        "git",
        [
          "-c",
          "core.quotePath=false",
          "diff",
          "--name-status",
          "--no-renames",
          "origin/main...HEAD"
        ],
        runInShell: true);
  }
  if (result.exitCode != 0) {
    throw "Failure";
  }

  List<String> paths = [];
  for (String line in result.stdout.toString().split("\n")) {
    List<String> split = line.split("\t");
    if (split.length != 2) continue;
    String path = split[1].trim().replaceAll("\\", "/");
    paths.add(path);
  }
  return paths;
}

/// If [inner] is a dir or file inside [outer] this returns the index into
/// `inner.pathSegments` corresponding to the folder- or filename directly
/// inside [outer].
/// If [inner] is not inside [outer] it returns null.
int? _getPathSegmentIndexIfSubEntry(Uri outer, Uri inner) {
  List<String> outerPathSegments = outer.pathSegments;
  List<String> innerPathSegments = inner.pathSegments;
  if (innerPathSegments.length < outerPathSegments.length) return null;
  int end = outerPathSegments.length;
  if (outerPathSegments.last == "") end--;
  for (int i = 0; i < end; i++) {
    if (outerPathSegments[i] != innerPathSegments[i]) {
      return null;
    }
  }
  return end;
}
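As a standalone sketch (not part of the CL; the file URIs are hypothetical), the segment-index lookup above resolves the entry directly below `pkg/` like this:

```dart
// Standalone copy of the path-segment lookup, for illustration only.
int? getPathSegmentIndexIfSubEntry(Uri outer, Uri inner) {
  List<String> outerPathSegments = outer.pathSegments;
  List<String> innerPathSegments = inner.pathSegments;
  if (innerPathSegments.length < outerPathSegments.length) return null;
  int end = outerPathSegments.length;
  if (outerPathSegments.last == "") end--;
  for (int i = 0; i < end; i++) {
    if (outerPathSegments[i] != innerPathSegments[i]) return null;
  }
  return end;
}

void main() {
  Uri pkgDir = Uri.parse("file:///repo/pkg/");
  Uri caller = Uri.parse("file:///repo/pkg/front_end/presubmit_helper.dart");
  // caller.pathSegments is ["repo", "pkg", "front_end",
  // "presubmit_helper.dart"], so the entry directly inside pkg/ is index 2.
  int? index = getPathSegmentIndexIfSubEntry(pkgDir, caller);
  print(caller.pathSegments[index!]); // front_end
  // A URI outside pkg/ yields null.
  print(getPathSegmentIndexIfSubEntry(pkgDir, Uri.parse("file:///repo/tools/x")));
}
```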
Future<void> _run(
  String script,
  List<String> scriptArguments,
) async {
  List<String> arguments = [];
  arguments.add("$script");
  arguments.addAll(scriptArguments);

  Stopwatch stopwatch = new Stopwatch()..start();
  ProcessResult result = await Process.run(_dartVm, arguments,
      workingDirectory: _repoDir.toFilePath());
  String runWhat = "${_dartVm} ${arguments.join(' ')}";
  if (result.exitCode != 0) {
    exitCode = result.exitCode;
    print("-----");
    print("Running: $runWhat: "
        "Failed with exit code ${result.exitCode} "
        "in ${stopwatch.elapsedMilliseconds} ms.");
    String stdout = result.stdout.toString();
    stdout = stdout.trim();
    if (stdout.isNotEmpty) {
      print("--- stdout start ---");
      print(stdout);
      print("--- stdout end ---");
    }

    String stderr = result.stderr.toString().trim();
    if (stderr.isNotEmpty) {
      print("--- stderr start ---");
      print(stderr);
      print("--- stderr end ---");
    }
  } else {
    print("Running: $runWhat: Done in ${stopwatch.elapsedMilliseconds} ms.");
  }
}
// This script is potentially called from several places (once from each),
// but we only want to actually do the work once. From the changed files we
// figure out which callers would invoke this script, and only if the caller
// is the first one (alphabetically) do we actually run.
bool _shouldRun(final List<String> changedFiles, final String callerPath) {
  Uri pkgDir = _repoDir.resolve("pkg/");
  Uri callerUri = Uri.base.resolveUri(Uri.file(callerPath));
  int? endPathIndex = _getPathSegmentIndexIfSubEntry(pkgDir, callerUri);
  if (endPathIndex == null) {
    throw "Unsupported path";
  }
  final String callerPkgDir = callerUri.pathSegments[endPathIndex];
  if (!_usDirs.contains(callerPkgDir)) {
    throw "Unsupported dir: $callerPkgDir -- expected one of $_usDirs.";
  }

  final Set<String> changedUsDirsSet = {};
  for (String path in changedFiles) {
    if (!path.startsWith("pkg/")) continue;
    List<String> paths = path.split("/");
    if (paths.length < 2) continue;
    if (_usDirs.contains(paths[1])) {
      changedUsDirsSet.add(paths[1]);
    }
  }

  if (changedUsDirsSet.isEmpty) {
    print("We have no changes.");
    return false;
  }

  final List<String> changedUsDirs = changedUsDirsSet.toList()..sort();
  if (changedUsDirs.first != callerPkgDir) {
    print("We expect this file to be called elsewhere which will do the work.");
    return false;
  }
  return true;
}

/// The `generated_files_up_to_date_git_test.dart` file imports
/// package:dart_style, which imports package:analyzer, so it's a lot of extra
/// stuff to compile (and thus an expensive script to start).
/// Therefore it's not done in the same way as the other things, but instead
/// launched separately.
bool _shouldRunGenerateFilesTest(List<String> changedFiles) {
  for (String path in changedFiles) {
    if (_generatedFilesUpToDateFiles.contains(path)) {
      return true;
    }
  }

  return false;
}
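The "first changed dir wins" election in `_shouldRun` can be sketched standalone; the changed file paths below are hypothetical:

```dart
// Sketch of the election: every affected presubmit launches, but only the
// invocation belonging to the alphabetically first changed package directory
// actually runs the smoke tests; the others exit immediately.
void main() {
  const List<String> usDirs = [
    "_fe_analyzer_shared",
    "front_end",
    "frontend_server",
    "kernel",
  ];
  List<String> changedFiles = [
    "pkg/frontend_server/lib/server.dart",
    "pkg/front_end/lib/api.dart",
  ];
  Set<String> changedUsDirs = {};
  for (String path in changedFiles) {
    List<String> parts = path.split("/");
    if (parts.length >= 2 && parts[0] == "pkg" && usDirs.contains(parts[1])) {
      changedUsDirs.add(parts[1]);
    }
  }
  List<String> sorted = changedUsDirs.toList()..sort();
  // "front_end" sorts before "frontend_server", so its presubmit does the
  // work and covers both.
  print(sorted.first); // front_end
}
```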
class DepsTestWork extends Work {
  DepsTestWork();

  @override
  String get name => "Deps test";

  @override
  Map<String, Object?> toJson() {
    return {
      "WorkTypeIndex": WorkEnum.DepsTest.index,
    };
  }

  static Work fromJson(Map<String, Object?> json) {
    return new DepsTestWork();
  }
}

class ExplicitCreationWork extends Work {
  final Set<Uri> includedFiles;
  final Set<Uri> includedDirectoryUris;
  final Uri repoDir;

  ExplicitCreationWork(
      {required this.includedFiles,
      required this.includedDirectoryUris,
      required this.repoDir});

  @override
  String get name => "explicit creation test";

  @override
  Map<String, Object?> toJson() {
    return {
      "WorkTypeIndex": WorkEnum.ExplicitCreation.index,
      "includedFiles": includedFiles.map((e) => e.toString()).toList(),
      "includedDirectoryUris":
          includedDirectoryUris.map((e) => e.toString()).toList(),
      "repoDir": repoDir.toString(),
    };
  }

  static Work fromJson(Map<String, Object?> json) {
    return new ExplicitCreationWork(
      includedFiles: Set<Uri>.from(
          (json["includedFiles"] as Iterable).map((e) => Uri.parse(e))),
      includedDirectoryUris: Set<Uri>.from(
          (json["includedDirectoryUris"] as Iterable).map((e) => Uri.parse(e))),
      repoDir: Uri.parse(json["repoDir"] as String),
    );
  }
}

class LintWork extends Work {
  final List<String> filters;
  final Uri repoDir;

  LintWork({required this.filters, required this.repoDir});

  @override
  String get name => "Lint test";

  @override
  Map<String, Object?> toJson() {
    return {
      "WorkTypeIndex": WorkEnum.Lint.index,
      "filters": filters,
      "repoDir": repoDir.toString(),
    };
  }

  static Work fromJson(Map<String, Object?> json) {
    return new LintWork(
      filters: List<String>.from(json["filters"] as Iterable),
      repoDir: Uri.parse(json["repoDir"] as String),
    );
  }
}

class MessagesWork extends Work {
  final Uri repoDir;

  MessagesWork({required this.repoDir});

  @override
  String get name => "messages test";

  @override
  Map<String, Object?> toJson() {
    return {
      "WorkTypeIndex": WorkEnum.Messages.index,
      "repoDir": repoDir.toString(),
    };
  }

  static Work fromJson(Map<String, Object?> json) {
    return new MessagesWork(
      repoDir: Uri.parse(json["repoDir"] as String),
    );
  }
}

class SpawnHelper {
  bool _spawned = false;
  late ReceivePort _receivePort;
  late SendPort _sendPort;
  late void Function(dynamic data) onData;
  final List<dynamic> data = [];

  void close() {
    if (!_spawned) throw "Not spawned!";
    _receivePort.close();
  }

  void send(Object? message) {
    if (!_spawned) throw "Not spawned!";
    _sendPort.send(message);
  }

  Future<void> spawn(Uri spawnUri, void Function(dynamic data) onData) async {
    if (_spawned) throw "Already spawned!";
    _spawned = true;
    this.onData = onData;
    _receivePort = ReceivePort();
    await Isolate.spawnUri(spawnUri, const [], _receivePort.sendPort);
    final Completer<SendPort> sendPortCompleter = Completer<SendPort>();
    _receivePort.listen((dynamic receivedData) {
      if (!sendPortCompleter.isCompleted) {
        sendPortCompleter.complete(receivedData);
      } else {
        onData(receivedData);
      }
    });
    _sendPort = await sendPortCompleter.future;
  }
}

class SpellNotSourceWork extends Work {
  final List<String> filters;
  final Uri repoDir;

  SpellNotSourceWork({required this.filters, required this.repoDir});

  @override
  String get name => "spell test not source";

  @override
  Map<String, Object?> toJson() {
    return {
      "WorkTypeIndex": WorkEnum.SpellingNotSource.index,
      "filters": filters,
      "repoDir": repoDir.toString(),
    };
  }

  static Work fromJson(Map<String, Object?> json) {
    return new SpellNotSourceWork(
      filters: List<String>.from(json["filters"] as Iterable),
      repoDir: Uri.parse(json["repoDir"] as String),
    );
  }
}

class SpellSourceWork extends Work {
  final List<String> filters;
  final Uri repoDir;

  SpellSourceWork({required this.filters, required this.repoDir});

  @override
  String get name => "spell test source";

  @override
  Map<String, Object?> toJson() {
    return {
      "WorkTypeIndex": WorkEnum.SpellingSource.index,
      "filters": filters,
      "repoDir": repoDir.toString(),
    };
  }

  static Work fromJson(Map<String, Object?> json) {
    return new SpellSourceWork(
      filters: List<String>.from(json["filters"] as Iterable),
      repoDir: Uri.parse(json["repoDir"] as String),
    );
  }
}

sealed class Work {
  String get name;

  Map<String, Object?> toJson();

  static Work workFromJson(Map<String, Object?> json) {
    dynamic workTypeIndex = json["WorkTypeIndex"];
    if (workTypeIndex is! int ||
        workTypeIndex < 0 ||
        workTypeIndex >= WorkEnum.values.length) {
      throw "Cannot convert to a Work object.";
    }
    WorkEnum workType = WorkEnum.values[workTypeIndex];
    switch (workType) {
      case WorkEnum.ExplicitCreation:
        return ExplicitCreationWork.fromJson(json);
      case WorkEnum.Messages:
        return MessagesWork.fromJson(json);
      case WorkEnum.SpellingNotSource:
        return SpellNotSourceWork.fromJson(json);
      case WorkEnum.SpellingSource:
        return SpellSourceWork.fromJson(json);
      case WorkEnum.Lint:
        return LintWork.fromJson(json);
      case WorkEnum.DepsTest:
        return DepsTestWork.fromJson(json);
    }
  }
}

enum WorkEnum {
  ExplicitCreation,
  Messages,
  SpellingNotSource,
  SpellingSource,
  Lint,
  DepsTest,
}

extension on List<Work> {
  void addIfNotNull(Work? element) {
    if (element == null) return;
    add(element);
  }
}
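The Work classes above round-trip through JSON (keyed by their `WorkEnum` index) so a work item can be handed to the spawned isolate as a string. A minimal standalone sketch of that round-trip, with only one simplified work type shown:

```dart
import 'dart:convert';

// Simplified stand-in for the Work hierarchy: serialize with the enum index,
// decode on the other side, and reconstruct the right subclass.
enum WorkEnum { Lint }

class LintWork {
  final List<String> filters;
  LintWork({required this.filters});

  Map<String, Object?> toJson() => {
        "WorkTypeIndex": WorkEnum.Lint.index,
        "filters": filters,
      };

  static LintWork fromJson(Map<String, Object?> json) =>
      LintWork(filters: List<String>.from(json["filters"] as Iterable));
}

void main() {
  // This is what would travel over the SendPort as a plain string.
  String wire = json.encode(LintWork(filters: ["pkg/front_end/lib/"]).toJson());
  Map<String, Object?> decoded =
      Map<String, Object?>.from(json.decode(wire) as Map);
  LintWork roundTripped = LintWork.fromJson(decoded);
  print(roundTripped.filters); // [pkg/front_end/lib/]
}
```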
211  pkg/front_end/presubmit_helper_spawn.dart  (new file)

@@ -0,0 +1,211 @@
// Copyright (c) 2024, the Dart project authors. Please see the AUTHORS file
// for details. All rights reserved. Use of this source code is governed by a
// BSD-style license that can be found in the LICENSE file.

import 'dart:convert';
import 'dart:isolate' show Isolate, ReceivePort, SendPort;

import 'package:testing/src/log.dart' show Logger;
import 'package:testing/src/suite.dart';
import 'package:testing/testing.dart' as testing;

import 'presubmit_helper.dart';
import 'test/deps_git_test.dart' as deps_test;
import 'test/explicit_creation_impl.dart' show runExplicitCreationTest;
import 'test/fasta/messages_suite.dart' as messages_suite;
import 'test/lint_suite.dart' as lint_suite;
import 'test/spelling_test_not_src_suite.dart' as spelling_test_not_src;
import 'test/spelling_test_src_suite.dart' as spelling_test_src;

Future<void> main(List<String> args, [SendPort? sendPort]) async {
  if (sendPort == null) throw "Need a send-port.";
  var isolateReceivePort = ReceivePort();
  isolateReceivePort.listen((rawData) async {
    if (rawData is! String) {
      print("Got unexpected data of type ${rawData.runtimeType}");
      sendPort.send(false);
      return;
    }
    Work work = Work.workFromJson(json.decode(rawData));
    Stopwatch stopwatch = new Stopwatch()..start();
    switch (work) {
      case ExplicitCreationWork():
        int explicitCreationErrorsFound = -1;
        try {
          explicitCreationErrorsFound = await Isolate.run(() =>
              runExplicitCreationTest(
                  includedFiles: work.includedFiles,
                  includedDirectoryUris: work.includedDirectoryUris,
                  repoDir: work.repoDir));
        } catch (e) {
          // This will make it send false.
          explicitCreationErrorsFound = -1;
        }
        print("Sending ok = ${explicitCreationErrorsFound == 0} "
            "for ${work.name} after ${stopwatch.elapsed}");
        sendPort.send(explicitCreationErrorsFound == 0);

      case MessagesWork():
        bool ok;
        try {
          ok = await Isolate.run(() async {
            ErrorNotingLogger logger = new ErrorNotingLogger();
            await testing.runMe(
              const ["-DfastOnly=true"],
              messages_suite.createContext,
              me: work.repoDir
                  .resolve("pkg/front_end/test/fasta/messages_suite.dart"),
              configurationPath: "../../testing.json",
              logger: logger,
            );
            return !logger.gotFailure;
          });
        } catch (e) {
          ok = false;
        }
        print("Sending ok = $ok "
            "for ${work.name} after ${stopwatch.elapsed}");
        sendPort.send(ok);

      case SpellNotSourceWork():
        bool ok;
        try {
          ok = await Isolate.run(() async {
            ErrorNotingLogger logger = new ErrorNotingLogger();
            await testing.runMe(
              ["--", ...work.filters],
              spelling_test_not_src.createContext,
              me: work.repoDir.resolve(
                  "pkg/front_end/test/spelling_test_not_src_suite.dart"),
              configurationPath: "../testing.json",
              logger: logger,
            );
            return !logger.gotFailure;
          });
        } catch (e) {
          ok = false;
        }
        print("Sending ok = $ok "
            "for ${work.name} after ${stopwatch.elapsed}");
        sendPort.send(ok);

      case SpellSourceWork():
        bool ok;
        try {
          ok = await Isolate.run(() async {
            ErrorNotingLogger logger = new ErrorNotingLogger();
            await testing.runMe(
              ["--", ...work.filters],
              spelling_test_src.createContext,
              me: work.repoDir
                  .resolve("pkg/front_end/test/spelling_test_src_suite.dart"),
              configurationPath: "../testing.json",
              logger: logger,
            );
            return !logger.gotFailure;
          });
        } catch (e) {
          ok = false;
        }
        print("Sending ok = $ok "
            "for ${work.name} after ${stopwatch.elapsed}");
        sendPort.send(ok);

      case LintWork():
        bool ok;
        try {
          ok = await Isolate.run(() async {
            ErrorNotingLogger logger = new ErrorNotingLogger();
            await testing.runMe(
              ["--", ...work.filters],
              lint_suite.createContext,
              me: work.repoDir.resolve("pkg/front_end/test/lint_suite.dart"),
              configurationPath: "../testing.json",
              logger: logger,
            );
            return !logger.gotFailure;
          });
        } catch (e) {
          ok = false;
        }
        print("Sending ok = $ok "
            "for ${work.name} after ${stopwatch.elapsed}");
        sendPort.send(ok);

      case DepsTestWork():
        bool ok;
        try {
          ok = await Isolate.run(() {
            return deps_test.main();
          });
        } catch (e) {
          ok = false;
        }
        print("Sending ok = $ok "
            "for ${work.name} after ${stopwatch.elapsed}");
        sendPort.send(ok);
    }
  });
  sendPort.send(isolateReceivePort.sendPort);
}

class ErrorNotingLogger implements Logger {
  bool gotFailure = false;

  @override
  void logExpectedResult(Suite suite, testing.TestDescription description,
      testing.Result result, Set<testing.Expectation> expectedOutcomes) {}

  @override
  void logMessage(Object message) {}

  @override
  void logNumberedLines(String text) {}

  @override
  void logProgress(String message) {}

  @override
  void logStepComplete(
      int completed,
      int failed,
      int total,
      Suite suite,
      testing.TestDescription description,
      testing.Step<dynamic, dynamic, testing.ChainContext> step) {}

  @override
  void logStepStart(
      int completed,
      int failed,
      int total,
      Suite suite,
      testing.TestDescription description,
      testing.Step<dynamic, dynamic, testing.ChainContext> step) {}

  @override
  void logSuiteComplete(Suite suite) {}

  @override
  void logSuiteStarted(Suite suite) {}

  @override
  void logTestComplete(int completed, int failed, int total, Suite suite,
      testing.TestDescription description) {}

  @override
  void logTestStart(int completed, int failed, int total, Suite suite,
      testing.TestDescription description) {}

  @override
  void logUncaughtError(error, StackTrace stackTrace) {
    gotFailure = true;
  }

  @override
  void logUnexpectedResult(Suite suite, testing.TestDescription description,
      testing.Result result, Set<testing.Expectation> expectedOutcomes) {
    gotFailure = true;
  }

  @override
  void noticeFrameworkCatchError(error, StackTrace stackTrace) {
    gotFailure = true;
  }
}
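The spawn file's protocol is: the child's very first message back is its own `SendPort` (the handshake the `SpawnHelper` completer waits for), and every later message is a boolean result. A minimal standalone sketch of that handshake, using `Isolate.spawn` with a local entry point instead of `Isolate.spawnUri` for brevity (the message payloads here are hypothetical):

```dart
import 'dart:async';
import 'dart:isolate';

// Child isolate: first message out is its SendPort, later messages are
// results for whatever work it was asked to do.
void child(SendPort parentPort) {
  ReceivePort childReceive = ReceivePort();
  parentPort.send(childReceive.sendPort); // handshake
  childReceive.listen((dynamic message) {
    parentPort.send("ok: $message"); // result
  });
}

Future<void> main() async {
  ReceivePort receive = ReceivePort();
  await Isolate.spawn(child, receive.sendPort);
  Completer<SendPort> sendPortCompleter = Completer<SendPort>();
  receive.listen((dynamic data) {
    if (!sendPortCompleter.isCompleted) {
      sendPortCompleter.complete(data as SendPort); // the handshake message
    } else {
      print(data); // a result, e.g. "ok: lint"
      receive.close();
    }
  });
  (await sendPortCompleter.future).send("lint");
}
```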
@@ -6,12 +6,14 @@ import 'dart:io';
 import 'package:front_end/src/api_prototype/compiler_options.dart' as api;
 import 'package:front_end/src/api_prototype/file_system.dart' as api;
+import 'package:front_end/src/api_prototype/incremental_kernel_generator.dart';
 import 'package:front_end/src/base/processed_options.dart';
 import 'package:front_end/src/compute_platform_binaries_location.dart'
     show computePlatformBinariesLocation;
 import 'package:front_end/src/fasta/compiler_context.dart';
 import 'package:front_end/src/fasta/constant_context.dart';
 import 'package:front_end/src/fasta/dill/dill_target.dart';
+import 'package:front_end/src/fasta/incremental_compiler.dart';
 import 'package:front_end/src/fasta/kernel/body_builder.dart';
 import 'package:front_end/src/fasta/kernel/body_builder_context.dart';
 import 'package:front_end/src/fasta/kernel/kernel_target.dart';
@@ -54,6 +56,10 @@ api.CompilerOptions getOptions(
   return options;
 }
 
+/// [splitCompileAndCompileLess] Will use the incremental compiler to compile
+/// an outline of everything, then compile the bodies of the [input]. This also
+/// makes the compile pipeline skip transformations as for instance the VMs
+/// mixin transformation isn't compatible (and will actively crash).
 Future<BuildResult> compile(
     {required List<Uri> inputs,
     void Function(api.DiagnosticMessage message)? onDiagnostic,
@@ -62,7 +68,8 @@ Future<BuildResult> compile(
     bool compileSdk = false,
     KernelTargetCreator kernelTargetCreator = KernelTargetTest.new,
     BodyBuilderCreator bodyBuilderCreator = defaultBodyBuilderCreator,
-    api.FileSystem? fileSystem}) async {
+    api.FileSystem? fileSystem,
+    bool splitCompileAndCompileLess = false}) async {
   Ticker ticker = new Ticker(isVerbose: false);
   api.CompilerOptions compilerOptions = getOptions(
     repoDir: repoDir,
@@ -76,30 +83,97 @@ Future<BuildResult> compile(
 
   return await CompilerContext.runWithOptions(processedOptions,
       (CompilerContext c) async {
-    UriTranslator uriTranslator = await c.options.getUriTranslator();
-    DillTarget dillTarget =
-        new DillTarget(ticker, uriTranslator, c.options.target);
-    KernelTarget kernelTarget = kernelTargetCreator(
-        c.fileSystem, false, dillTarget, uriTranslator, bodyBuilderCreator);
-
-    Uri? platform = c.options.sdkSummary;
-    if (platform != null) {
-      var bytes = new File.fromUri(platform).readAsBytesSync();
-      var platformComponent = loadComponentFromBytes(bytes);
-      dillTarget.loader
-          .appendLibraries(platformComponent, byteCount: bytes.length);
-    }
-
-    kernelTarget.setEntryPoints(c.options.inputs);
-    dillTarget.buildOutlines();
-    BuildResult buildResult = await kernelTarget.buildOutlines();
-    buildResult = await kernelTarget.buildComponent(
-        macroApplications: buildResult.macroApplications);
-    buildResult.macroApplications?.close();
-    return buildResult;
+    if (splitCompileAndCompileLess) {
+      TestIncrementalCompiler outlineIncrementalCompiler =
+          new TestIncrementalCompiler(bodyBuilderCreator, c, outlineOnly: true);
+      // Outline
+      IncrementalCompilerResult outlineResult = await outlineIncrementalCompiler
+          .computeDelta(entryPoints: c.options.inputs);
+      print("Build outline of "
+          "${outlineResult.component.libraries.length} libraries");
+
+      // Full of the asked inputs.
+      TestIncrementalCompiler incrementalCompiler =
+          new TestIncrementalCompiler.fromComponent(
+              bodyBuilderCreator, c, outlineResult.component);
+      for (Uri uri in c.options.inputs) {
+        incrementalCompiler.invalidate(uri);
+      }
+      IncrementalCompilerResult result = await incrementalCompiler.computeDelta(
+          entryPoints: c.options.inputs, fullComponent: true);
+      print("Build bodies of "
+          "${incrementalCompiler.recorderForTesting.rebuildBodiesCount} "
+          "libraries.");
+
+      return new BuildResult(component: result.component);
+    } else {
+      UriTranslator uriTranslator = await c.options.getUriTranslator();
+      DillTarget dillTarget =
+          new DillTarget(ticker, uriTranslator, c.options.target);
+      KernelTarget kernelTarget = kernelTargetCreator(
+          c.fileSystem, false, dillTarget, uriTranslator, bodyBuilderCreator);
+
+      Uri? platform = c.options.sdkSummary;
+      if (platform != null) {
+        var bytes = new File.fromUri(platform).readAsBytesSync();
+        var platformComponent = loadComponentFromBytes(bytes);
+        dillTarget.loader
+            .appendLibraries(platformComponent, byteCount: bytes.length);
+      }
+
+      kernelTarget.setEntryPoints(c.options.inputs);
+      dillTarget.buildOutlines();
+      BuildResult buildResult = await kernelTarget.buildOutlines();
+      buildResult = await kernelTarget.buildComponent(
+          macroApplications: buildResult.macroApplications);
+      buildResult.macroApplications?.close();
+      return buildResult;
+    }
   });
 }
+
+class TestIncrementalCompiler extends IncrementalCompiler {
+  final BodyBuilderCreator bodyBuilderCreator;
+
+  @override
+  final TestRecorderForTesting recorderForTesting =
+      new TestRecorderForTesting();
+
+  TestIncrementalCompiler(
+    this.bodyBuilderCreator,
+    CompilerContext context, {
+    Uri? initializeFromDillUri,
+    required bool outlineOnly,
+  }) : super(context, initializeFromDillUri, outlineOnly);
+
+  TestIncrementalCompiler.fromComponent(
+      this.bodyBuilderCreator, super.context, super._componentToInitializeFrom)
+      : super.fromComponent();
+
+  @override
+  bool get skipExperimentalInvalidationChecksForTesting => true;
+
+  @override
+  IncrementalKernelTarget createIncrementalKernelTarget(
+      api.FileSystem fileSystem,
+      bool includeComments,
+      DillTarget dillTarget,
+      UriTranslator uriTranslator) {
+    return new KernelTargetTest(fileSystem, includeComments, dillTarget,
+        uriTranslator, bodyBuilderCreator)
+      ..skipTransformations = true;
+  }
+}
+
+class TestRecorderForTesting extends RecorderForTesting {
+  int rebuildBodiesCount = 0;
+
+  @override
+  void recordRebuildBodiesCount(int count) {
+    rebuildBodiesCount = count;
+  }
+}
 
 typedef KernelTargetCreator = KernelTargetTest Function(
     api.FileSystem fileSystem,
     bool includeComments,
@@ -107,22 +181,29 @@ typedef KernelTargetCreator = KernelTargetTest Function(
     UriTranslator uriTranslator,
     BodyBuilderCreator bodyBuilderCreator);
 
-class KernelTargetTest extends KernelTarget {
+class KernelTargetTest extends IncrementalKernelTarget {
   final BodyBuilderCreator bodyBuilderCreator;
+  bool skipTransformations = false;
 
   KernelTargetTest(
       api.FileSystem fileSystem,
      bool includeComments,
      DillTarget dillTarget,
      UriTranslator uriTranslator,
-      this.bodyBuilderCreator)
-      : super(fileSystem, includeComments, dillTarget, uriTranslator);
+      this.bodyBuilderCreator,
+      ) : super(fileSystem, includeComments, dillTarget, uriTranslator);
 
   @override
   SourceLoader createLoader() {
     return new SourceLoaderTest(
         fileSystem, includeComments, this, bodyBuilderCreator);
   }
+
+  @override
+  void runBuildTransformations() {
+    if (skipTransformations) return;
+    super.runBuildTransformations();
+  }
 }
 
 class SourceLoaderTest extends SourceLoader {
@@ -43,7 +43,8 @@ Set<String> allowlistedExternalDartFiles = {
   "pkg/meta/lib/meta_meta.dart",
 };
 
-Future<void> main() async {
+/// Returns true on no errors and false if errors were found.
+Future<bool> main() async {
   Ticker ticker = new Ticker(isVerbose: false);
   CompilerOptions compilerOptions = getOptions();
 
@@ -139,7 +140,9 @@ Future<void> main() async {
       print(" - $uri");
     }
     exitCode = 1;
+    return false;
   }
+  return true;
 }
 
 CompilerOptions getOptions() {
@@ -4,44 +4,30 @@
 
 import 'dart:io';
 
-import 'package:_fe_analyzer_shared/src/messages/severity.dart';
-import 'package:_fe_analyzer_shared/src/scanner/token.dart';
-import 'package:front_end/src/api_prototype/compiler_options.dart' as api;
-import 'package:front_end/src/fasta/builder/declaration_builders.dart';
-import 'package:front_end/src/fasta/builder/type_builder.dart';
-import 'package:front_end/src/fasta/fasta_codes.dart' as fasta;
-import 'package:front_end/src/fasta/kernel/body_builder.dart';
-import 'package:front_end/src/fasta/kernel/constness.dart';
-import 'package:front_end/src/fasta/kernel/expression_generator_helper.dart';
-import 'package:kernel/kernel.dart';
-
-import 'compiler_test_helper.dart';
 import 'testing_utils.dart' show getGitFiles;
-import "utils/io_utils.dart";
+import "utils/io_utils.dart" show computeRepoDirUri;
+import "explicit_creation_impl.dart" show runExplicitCreationTest;
 
-final Uri repoDir = computeRepoDirUri();
-
-Set<Uri> libUris = {};
-Set<Uri> ignoredLibUris = {};
-
-int errorCount = 0;
-
 Future<void> main(List<String> args) async {
-  ignoredLibUris.add(repoDir.resolve("pkg/frontend_server/test/fixtures/"));
+  final Uri repoDir = computeRepoDirUri();
 
+  Set<Uri> libUris = {};
 
   if (args.isEmpty) {
     libUris.add(repoDir.resolve("pkg/front_end/lib/"));
     libUris.add(repoDir.resolve("pkg/_fe_analyzer_shared/lib/"));
     libUris.add(repoDir.resolve("pkg/frontend_server/"));
   } else {
-    if (args[0] == "--front-end-only") {
-      libUris.add(repoDir.resolve("pkg/front_end/lib/"));
-    } else if (args[0] == "--shared-only") {
-      libUris.add(repoDir.resolve("pkg/_fe_analyzer_shared/lib/"));
-    } else if (args[0] == "--frontend_server-only") {
-      libUris.add(repoDir.resolve("pkg/frontend_server/"));
-    } else {
-      throw "Unsupported arguments: $args";
+    for (String arg in args) {
+      if (arg == "--front-end-only") {
+        libUris.add(repoDir.resolve("pkg/front_end/lib/"));
+      } else if (arg == "--shared-only") {
+        libUris.add(repoDir.resolve("pkg/_fe_analyzer_shared/lib/"));
+      } else if (arg == "--frontend_server-only") {
+        libUris.add(repoDir.resolve("pkg/frontend_server/"));
+      } else {
+        throw "Unsupported arguments: $args";
+      }
     }
   }
 
@@ -58,104 +44,10 @@ Future<void> main(List<String> args) async {
     }
   }
 
-  for (Uri uri in ignoredLibUris) {
-    List<FileSystemEntity> entities =
-        new Directory.fromUri(uri).listSync(recursive: true);
-    for (FileSystemEntity entity in entities) {
-      if (entity is File && entity.path.endsWith(".dart")) {
-        inputs.remove(entity.uri);
-      }
-    }
-  }
-
-  Uri packageConfigUri = repoDir.resolve(".dart_tool/package_config.json");
-  if (!new File.fromUri(packageConfigUri).existsSync()) {
-    throw "Couldn't find .dart_tool/package_config.json";
-  }
-
-  Stopwatch stopwatch = new Stopwatch()..start();
-
-  await compile(
-      inputs: inputs.toList(),
-      // Compile sdk because when this is run from a lint it uses the checked-in
-      // sdk and we might not have a suitable compiled platform.dill file.
-      compileSdk: true,
-      packagesFileUri: packageConfigUri,
-      onDiagnostic: (api.DiagnosticMessage message) {
-        if (message.severity == Severity.error) {
-          print(message.plainTextFormatted.join('\n'));
-          errorCount++;
-          exitCode = 1;
-        }
-      },
-      repoDir: repoDir,
-      bodyBuilderCreator: (
-        create: BodyBuilderTester.new,
-        createForField: BodyBuilderTester.forField,
-        createForOutlineExpression: BodyBuilderTester.forOutlineExpression
-      ));
-
-  print("Done in ${stopwatch.elapsedMilliseconds} ms. "
-      "Found $errorCount errors.");
-}
-
-class BodyBuilderTester = BodyBuilderTest with BodyBuilderTestMixin;
-
-mixin BodyBuilderTestMixin on BodyBuilder {
-  @override
-  Expression buildConstructorInvocation(
-      TypeDeclarationBuilder? type,
-      Token nameToken,
-      Token nameLastToken,
-      Arguments? arguments,
-      String name,
-      List<TypeBuilder>? typeArguments,
-      int charOffset,
-      Constness constness,
-      {bool isTypeArgumentsInForest = false,
-      TypeDeclarationBuilder? typeAliasBuilder,
-      required UnresolvedKind unresolvedKind}) {
-    Token maybeNewOrConst = nameToken.previous!;
-    bool doReport = true;
-    if (maybeNewOrConst is KeywordToken) {
-      if (maybeNewOrConst.lexeme == "new" ||
-          maybeNewOrConst.lexeme == "const") {
-        doReport = false;
-      }
-    } else if (maybeNewOrConst is SimpleToken) {
-      if (maybeNewOrConst.lexeme == "@") {
-        doReport = false;
-      }
-    }
-    if (doReport) {
-      bool match = false;
-      for (Uri libUri in libUris) {
-        if (uri.toString().startsWith(libUri.toString())) {
-          match = true;
-          break;
-        }
-      }
-      if (match) {
-        for (Uri libUri in ignoredLibUris) {
-          if (uri.toString().startsWith(libUri.toString())) {
-            match = false;
-            break;
-          }
-        }
-      }
-      if (!match) {
-        doReport = false;
-      }
-    }
-    if (doReport) {
-      addProblem(
-          fasta.templateUnspecified.withArguments("Should use new or const"),
-          nameToken.charOffset,
-          nameToken.length);
-    }
-    return super.buildConstructorInvocation(type, nameToken, nameLastToken,
-        arguments, name, typeArguments, charOffset, constness,
-        isTypeArgumentsInForest: isTypeArgumentsInForest,
-        unresolvedKind: unresolvedKind);
+  int explicitCreationErrorsFound = await runExplicitCreationTest(
+      includedFiles: inputs, includedDirectoryUris: libUris, repoDir: repoDir);
+  if (explicitCreationErrorsFound > 0) {
+    exitCode = 1;
   }
 }
163  pkg/front_end/test/explicit_creation_impl.dart  (new file)
@@ -0,0 +1,163 @@
+// Copyright (c) 2024, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:io' show File;
+
+import 'package:_fe_analyzer_shared/src/messages/severity.dart' show Severity;
+import 'package:_fe_analyzer_shared/src/scanner/token.dart'
+    show KeywordToken, SimpleToken, Token;
+import 'package:front_end/src/api_prototype/compiler_options.dart' as api
+    show DiagnosticMessage;
+import 'package:front_end/src/fasta/builder/declaration_builders.dart'
+    show TypeDeclarationBuilder;
+import 'package:front_end/src/fasta/builder/type_builder.dart' show TypeBuilder;
+import 'package:front_end/src/fasta/fasta_codes.dart' as fasta
+    show templateUnspecified;
+import 'package:front_end/src/fasta/kernel/body_builder.dart' show BodyBuilder;
+import 'package:front_end/src/fasta/kernel/constness.dart' show Constness;
+import 'package:front_end/src/fasta/kernel/expression_generator_helper.dart'
+    show UnresolvedKind;
+import 'package:front_end/src/fasta/kernel/kernel_target.dart' show BuildResult;
+import 'package:kernel/kernel.dart' show Arguments, Expression;
+
+import 'compiler_test_helper.dart' show BodyBuilderTest, compile;
+
+Set<Uri> _includedDirectoryUris = {};
+Set<Uri> _ignoredDirectoryUris = {};
+
+/// Run the explicit creation test (i.e. reporting missing 'new' tokens).
+///
+/// Explicitly compiles [includedFiles], reporting only errors for files in a
+/// path in [includedDirectoryUris] and not in [ignoredDirectoryUris].
+/// Note that this means that there can be reported errors in files not
+/// explicitly included in [includedFiles], although that is not guaranteed.
+///
+/// Returns the number of errors found.
+Future<int> runExplicitCreationTest(
+    {required Set<Uri> includedFiles,
+    required Set<Uri> includedDirectoryUris,
+    required Uri repoDir}) async {
+  _includedDirectoryUris.clear();
+  _includedDirectoryUris.addAll(includedDirectoryUris);
+  _ignoredDirectoryUris.clear();
+  _ignoredDirectoryUris
+      .add(repoDir.resolve("pkg/frontend_server/test/fixtures/"));
+  int errorCount = 0;
+
+  Uri packageConfigUri = repoDir.resolve(".dart_tool/package_config.json");
+  if (!new File.fromUri(packageConfigUri).existsSync()) {
+    throw "Couldn't find .dart_tool/package_config.json";
+  }
+
+  Set<Uri> includedFilesFiltered = {};
+  for (Uri uri in includedFiles) {
+    bool include = true;
+    for (Uri ignoredDir in _ignoredDirectoryUris) {
+      if (uri.toString().startsWith(ignoredDir.toString())) {
+        include = false;
+        break;
+      }
+    }
+    if (include) {
+      includedFilesFiltered.add(uri);
+    }
+  }
+
+  Stopwatch stopwatch = new Stopwatch()..start();
+
+  // TODO(jensj): While we need to compile the outline as normal, it should be
+  // sufficient to compile the body of the paths mentioned in [includedFiles].
+
+  // TODO(jensj): The target has to be VM or we can't compile the sdk,
+  // but probably we don't actually need to run any vm-specific transformations
+  // for instance.
+
+  BuildResult result = await compile(
+      inputs: includedFilesFiltered.toList(),
+      // Compile sdk because when this is run from a lint it uses the checked-in
+      // sdk and we might not have a suitable compiled platform.dill file.
+      compileSdk: true,
+      packagesFileUri: packageConfigUri,
+      onDiagnostic: (api.DiagnosticMessage message) {
+        if (message.severity == Severity.error) {
+          print(message.plainTextFormatted.join('\n'));
+          errorCount++;
+        }
+      },
+      repoDir: repoDir,
+      bodyBuilderCreator: (
+        create: BodyBuilderTester.new,
+        createForField: BodyBuilderTester.forField,
+        createForOutlineExpression: BodyBuilderTester.forOutlineExpression
+      ),
+      splitCompileAndCompileLess: true);
+
+  print("Done in ${stopwatch.elapsedMilliseconds} ms. "
+      "Found $errorCount errors.");
+
+  print("Compiled ${result.component?.libraries.length} libraries.");
+
+  return errorCount;
+}
+
+class BodyBuilderTester = BodyBuilderTest with BodyBuilderTestMixin;
+
+mixin BodyBuilderTestMixin on BodyBuilder {
+  @override
+  Expression buildConstructorInvocation(
+      TypeDeclarationBuilder? type,
+      Token nameToken,
+      Token nameLastToken,
+      Arguments? arguments,
+      String name,
+      List<TypeBuilder>? typeArguments,
+      int charOffset,
+      Constness constness,
+      {bool isTypeArgumentsInForest = false,
+      TypeDeclarationBuilder? typeAliasBuilder,
+      required UnresolvedKind unresolvedKind}) {
+    Token maybeNewOrConst = nameToken.previous!;
+    bool doReport = true;
+    if (maybeNewOrConst is KeywordToken) {
+      if (maybeNewOrConst.lexeme == "new" ||
+          maybeNewOrConst.lexeme == "const") {
+        doReport = false;
+      }
+    } else if (maybeNewOrConst is SimpleToken) {
+      if (maybeNewOrConst.lexeme == "@") {
+        doReport = false;
+      }
+    }
+    if (doReport) {
+      bool match = false;
+      for (Uri libUri in _includedDirectoryUris) {
+        if (uri.toString().startsWith(libUri.toString())) {
+          match = true;
+          break;
+        }
+      }
+      if (match) {
+        for (Uri libUri in _ignoredDirectoryUris) {
+          if (uri.toString().startsWith(libUri.toString())) {
+            match = false;
+            break;
+          }
+        }
+      }
+      if (!match) {
+        doReport = false;
+      }
+    }
+    if (doReport) {
+      addProblem(
+          fasta.templateUnspecified.withArguments("Should use new or const"),
+          nameToken.charOffset,
+          nameToken.length);
+    }
+    return super.buildConstructorInvocation(type, nameToken, nameLastToken,
+        arguments, name, typeArguments, charOffset, constness,
+        isTypeArgumentsInForest: isTypeArgumentsInForest,
+        unresolvedKind: unresolvedKind);
+  }
+}
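The include/ignore handling in the new helper above is a plain URI-prefix check: a file is considered only if it sits under an included directory and not under an ignored one. A minimal standalone sketch of the same idea (written in Python rather than Dart, with hypothetical paths, purely for illustration):

```python
def filter_files(files, included_dirs, ignored_dirs):
    """Keep only files under an included prefix and not under an ignored one,
    mirroring the string-prefix checks used by runExplicitCreationTest."""
    result = []
    for f in files:
        # Skip files outside every included directory.
        if not any(f.startswith(d) for d in included_dirs):
            continue
        # Skip files inside any ignored directory (e.g. test fixtures).
        if any(f.startswith(d) for d in ignored_dirs):
            continue
        result.append(f)
    return result
```

The same startswith-on-URI-strings trick works in Dart because `Uri.toString()` of a directory URI is a strict prefix of the URIs of the files below it.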
@@ -18,7 +18,8 @@ import 'utils/io_utils.dart' show computeRepoDirUri;
 
 final Uri repoDir = computeRepoDirUri();
 
-Future<void> main() async {
+/// Returns true on no errors and false if errors was found.
+Future<bool> main() async {
   messages();
   experimentalFlags();
   directParserAstHelper();
@@ -27,6 +28,7 @@ Future<void> main() async {
   AstModel astModel = await deriveAstModel(repoDir);
   await astEquivalence(astModel);
   await astCoverage(astModel);
+  return _checkFoundErrors == false;
 }
 
 void parserTestParser() {
@@ -105,6 +107,8 @@ void messages() {
       "dart pkg/front_end/tool/fasta.dart generate-messages");
 }
 
+bool _checkFoundErrors = false;
+
 void check(String generated, Uri generatedFile, String run) {
   String actual = new File.fromUri(generatedFile)
       .readAsStringSync()
@@ -122,5 +126,6 @@ is out of date. To regenerate the file, run
 ------------------------
 """);
     exitCode = 1;
+    _checkFoundErrors = true;
   }
 }
@@ -30,6 +30,7 @@ aligns
 allocations
 allowlist
 allowlisting
+alphabetically
 alt
 amend
 amended
@@ -236,6 +237,7 @@ dijkstra
 dijkstras
 dinteractive
 dirname
+dirs
 disagree
 disagreement
 disconnect
@@ -525,9 +527,11 @@ nondefault
 nonexisting
 noo
 noted
+noting
 nottest
 nq
 null'ed
+numbered
 numerator
 nums
 ob
@@ -640,6 +644,7 @@ rendition
 reorder
 reordering
 repaint
+representative
 repro
 reproduce
 reproduction
@@ -795,6 +800,7 @@ ugly
 unassignment
 unawaited
 unbreak
+uncaught
 unconverted
 uncover
 uncovers
@@ -813,6 +819,7 @@ unusual
 unversioned
 upgrade
 upload
+upstream
 upward
 uuid
 val
@@ -426,7 +426,7 @@ const Code<Null> code$name = message$name;
 
 // DO NOT EDIT. THIS FILE IS GENERATED. SEE TOP OF FILE.
 const MessageCode message$name =
-    const MessageCode(\"$name\", ${codeArguments.join(', ')});
+    const MessageCode(\"$name\", ${codeArguments.join(', ')},);
 """, isShared: canBeShared);
 }
 
@@ -450,13 +450,17 @@ const MessageCode message$name =
     messageArguments
         .add("correctionMessage: ${interpolate(correctionMessage)}");
   }
-  messageArguments.add("arguments: { ${arguments.join(', ')} }");
+  messageArguments.add("arguments: { ${arguments.join(', ')}, }");
+
+  if (codeArguments.isNotEmpty) {
+    codeArguments.add("");
+  }
 
   return new Template("""
 // DO NOT EDIT. THIS FILE IS GENERATED. SEE TOP OF FILE.
 const Template<Message Function(${parameters.join(', ')})> template$name =
     const Template<Message Function(${parameters.join(', ')})>(
-        ${templateArguments.join(', ')});
+        ${templateArguments.join(', ')},);
 
 // DO NOT EDIT. THIS FILE IS GENERATED. SEE TOP OF FILE.
 const Code<Message Function(${parameters.join(', ')})> code$name =
@@ -468,7 +472,7 @@ Message _withArguments$name(${parameters.join(', ')}) {
   ${conversions.join('\n  ')}
   return new Message(
       code$name,
-      ${messageArguments.join(', ')});
+      ${messageArguments.join(', ')},);
 }
 """, isShared: canBeShared);
 }
@@ -566,9 +566,12 @@ Future<AstModel> deriveAstModel(Uri repoDir, {bool printDump = false}) async {
   };
 
   InternalCompilerResult compilerResult = (await kernelForProgramInternal(
-      astLibraryUri, options,
-      retainDataForTesting: true,
-      requireMain: false)) as InternalCompilerResult;
+    astLibraryUri,
+    options,
+    retainDataForTesting: true,
+    requireMain: false,
+    buildComponent: false,
+  )) as InternalCompilerResult;
   if (errorsFound) {
     throw 'Errors found';
   }
@@ -1,8 +1,8 @@
 #!/usr/bin/env python3
-# Copyright (c) 2023, the Dart project authors. Please see the AUTHORS file
+# Copyright (c) 2024, the Dart project authors. Please see the AUTHORS file
 # for details. All rights reserved. Use of this source code is governed by a
 # BSD-style license that can be found in the LICENSE file.
-"""frontend_server specific presubmit script.
+"""CFE et al presubmit python script.
 
 See http://dev.chromium.org/developers/how-tos/depottools/presubmit-scripts
 for more details about the presubmit API built into gcl.
@@ -30,45 +30,35 @@ def load_source(modname, filename):
 
 
 def runSmokeTest(input_api, output_api):
-    hasChangedFiles = False
-    for git_file in input_api.AffectedTextFiles():
-        filename = git_file.AbsoluteLocalPath()
-        if filename.endswith(".dart"):
-            hasChangedFiles = True
-            break
-
-    if hasChangedFiles:
-        local_root = input_api.change.RepositoryRoot()
-        utils = load_source('utils',
-                            os.path.join(local_root, 'tools', 'utils.py'))
-        dart = os.path.join(utils.CheckedInSdkPath(), 'bin', 'dart')
-        smoke_test = os.path.join(local_root, 'pkg', 'frontend_server', 'test',
-                                  'quick_smoke_git_test.dart')
-
-        windows = utils.GuessOS() == 'win32'
-        if windows:
-            dart += '.exe'
-
-        if not os.path.isfile(dart):
-            print('WARNING: dart not found: %s' % dart)
-            return []
-
-        if not os.path.isfile(smoke_test):
-            print('WARNING: frontend_server smoke test not found: %s' %
-                  smoke_test)
-            return []
-
-        args = [dart, smoke_test]
-        process = subprocess.Popen(args,
-                                   stdout=subprocess.PIPE,
-                                   stdin=subprocess.PIPE)
-        outs, _ = process.communicate()
-
-        if process.returncode != 0:
-            return [
-                output_api.PresubmitError('Kernel smoke test failure(s):',
-                                          long_text=outs)
-            ]
+    local_root = input_api.change.RepositoryRoot()
+    utils = load_source('utils', os.path.join(local_root, 'tools', 'utils.py'))
+    dart = os.path.join(utils.CheckedInSdkPath(), 'bin', 'dart')
+    test_helper = os.path.join(local_root, 'pkg', 'front_end',
+                               'presubmit_helper.dart')
+
+    windows = utils.GuessOS() == 'win32'
+    if windows:
+        dart += '.exe'
+
+    if not os.path.isfile(dart):
+        print('WARNING: dart not found: %s' % dart)
+        return []
+
+    if not os.path.isfile(test_helper):
+        print('WARNING: CFE et al presubmit_helper not found: %s' % test_helper)
+        return []
+
+    args = [dart, test_helper, input_api.PresubmitLocalPath()]
+    process = subprocess.Popen(args,
+                               stdout=subprocess.PIPE,
+                               stdin=subprocess.PIPE)
+    outs, _ = process.communicate()
+
+    if process.returncode != 0:
+        return [
+            output_api.PresubmitError('CFE et al presubmit script failure(s):',
+                                      long_text=outs)
+        ]
 
     return []
 
@@ -1,8 +1,8 @@
 #!/usr/bin/env python3
-# Copyright (c) 2019, the Dart project authors. Please see the AUTHORS file
+# Copyright (c) 2024, the Dart project authors. Please see the AUTHORS file
 # for details. All rights reserved. Use of this source code is governed by a
 # BSD-style license that can be found in the LICENSE file.
-"""Kernel specific presubmit script.
+"""CFE et al presubmit python script.
 
 See http://dev.chromium.org/developers/how-tos/depottools/presubmit-scripts
 for more details about the presubmit API built into gcl.
@@ -30,42 +30,35 @@ def load_source(modname, filename):
 
 
 def runSmokeTest(input_api, output_api):
-    hasChangedFiles = False
-    for git_file in input_api.AffectedTextFiles():
-        filename = git_file.AbsoluteLocalPath()
-        if filename.endswith(".dart"):
-            hasChangedFiles = True
-            break
-
-    if hasChangedFiles:
-        local_root = input_api.change.RepositoryRoot()
-        utils = load_source('utils',
-                            os.path.join(local_root, 'tools', 'utils.py'))
-        dart = os.path.join(utils.CheckedInSdkPath(), 'bin', 'dart')
-        smoke_test = os.path.join(local_root, 'pkg', 'kernel', 'tool',
-                                  'smoke_test_quick.dart')
-
-        windows = utils.GuessOS() == 'win32'
-        if windows:
-            dart += '.exe'
-
-        if not os.path.isfile(dart):
-            print('WARNING: dart not found: %s' % dart)
-            return []
-
-        if not os.path.isfile(smoke_test):
-            print('WARNING: kernel smoke test not found: %s' % smoke_test)
-            return []
-
-        args = [dart, smoke_test]
-        process = subprocess.Popen(
-            args, stdout=subprocess.PIPE, stdin=subprocess.PIPE)
-        outs, _ = process.communicate()
-
-        if process.returncode != 0:
-            return [output_api.PresubmitError(
-                'Kernel smoke test failure(s):',
-                long_text=outs)]
+    local_root = input_api.change.RepositoryRoot()
+    utils = load_source('utils', os.path.join(local_root, 'tools', 'utils.py'))
+    dart = os.path.join(utils.CheckedInSdkPath(), 'bin', 'dart')
+    test_helper = os.path.join(local_root, 'pkg', 'front_end',
+                               'presubmit_helper.dart')
+
+    windows = utils.GuessOS() == 'win32'
+    if windows:
+        dart += '.exe'
+
+    if not os.path.isfile(dart):
+        print('WARNING: dart not found: %s' % dart)
+        return []
+
+    if not os.path.isfile(test_helper):
+        print('WARNING: CFE et al presubmit_helper not found: %s' % test_helper)
+        return []
+
+    args = [dart, test_helper, input_api.PresubmitLocalPath()]
+    process = subprocess.Popen(args,
+                               stdout=subprocess.PIPE,
+                               stdin=subprocess.PIPE)
+    outs, _ = process.communicate()
+
+    if process.returncode != 0:
+        return [
+            output_api.PresubmitError('CFE et al presubmit script failure(s):',
+                                      long_text=outs)
+        ]
 
     return []
 
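Both PRESUBMIT.py files now delegate to the shared pkg/front_end/presubmit_helper.dart, passing the presubmit's local path so the helper can work out which single invocation should do the work. The subprocess pattern they share can be sketched in isolation (a sketch only; the helper path and executable here are stand-ins for whatever the real scripts derive from the repo root):

```python
import subprocess

def run_presubmit_helper(dart, test_helper, presubmit_local_path):
    """Launch the shared presubmit helper the way both PRESUBMIT.py
    files do, returning (exit_code, combined_stdout)."""
    args = [dart, test_helper, presubmit_local_path]
    # stdin is a pipe so the child never blocks waiting on a terminal.
    process = subprocess.Popen(args,
                               stdout=subprocess.PIPE,
                               stdin=subprocess.PIPE)
    outs, _ = process.communicate()
    return process.returncode, outs
```

A non-zero exit code is what the callers translate into an `output_api.PresubmitError`, attaching the captured stdout as `long_text`.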