Enable Cached Building on GitHub Actions — webkit2gtk_4_1 → 2.50.1 (#13001)
* Refactor and update webkit2gtk_4_1
* Add arm patch.
* Adjust env options
* Add x86_64 build.
* Adjust build settings.
* Adjust arm build options.
* lint
* Adjust arm build.
* lint
* Adjust g++ in build.
* Add cache_build plumbing.
* Add NESTED_CI detection plumbing to see if we are running in a container on GitHub Actions.
* Adjust download options for cached builds.
* Adjust timed kill to kill cmake.
* Adjust triggering of cache_build.
* Cleanup output.
* Update cached build hash verification.
* Rubyize #{build_cachefile}.sha256 write.
* Adjust documentation of cache_build trigger.
* Also kill all ruby processes after finishing cache_build.
* Make cached build download info more useful.
* Add --regenerate-filelist option.
* Fix downloader.
* Try newer git commit.
* remove arm patch.
* Adjust hash checking for build downloads.
* Add message for checksum calculation since that can take a while.
* Add cached build restart code block.
* Add max build time to build workflow.
* fixup buildsystems
* Set workflow max build time to 5.5 hours.
* Indicate architectures for build in build workflow title.
* Adjust cached build uploading.
* Adjust workflow naming.
* Adjust installs after build.
* Adjust cached build logic.
* webkit => 2.50.1
* Adjust zstd options.
* Move CREW_CACHE_DIR to /tmp in GitHub Action containers.
* Adjust build cache location.
* revert crew const variable changes.
* Adjust CREW_KERNEL_VERSION for CI usage.
* Exclude @pkg.no_source_build? packages from cached builds.
* Always create CREW_CACHE_DIR.
* Clean up remnant @extract_dir folders from download command.
* Adjust permissions in workflow.
* Sync up workflows.
* lint
* Add x86_64 binaries
* Cleanup workflows.
* Do not use build cache if package binary exists.
* webkit: Package File Update Run on linux/amd64 container.
* webkit: Package File Update Run on linux/arm/v7 container.

---------

Signed-off-by: Satadru Pramanik <satadru@gmail.com>
Co-authored-by: satmandu <satmandu@users.noreply.github.com>
Committed by: GitHub
Parent: ed8f069a77
Commit: 6b35c08b2e
bin/crew (197 lines changed)
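The cached-build flow added below revolves around a per-package, per-architecture tarball plus a sha256 sidecar, written from CREW_BREW_DIR and later verified with `sha256sum -c`. For orientation, a minimal sketch of that naming and checkpointing convention (the helper names are illustrative and not part of bin/crew; CREW_CACHE_DIR follows the constants in lib/const.rb):

    require 'fileutils'

    # Illustrative only: mirrors the cache-file convention used by cache_build in bin/crew.
    def build_cachefile_for(name, version, arch, cache_dir)
      File.join(cache_dir, "#{name}-#{version}-build-#{arch}.tar.zst")
    end

    def cache_build_dir(build_dir, cachefile)
      FileUtils.mkdir_p File.dirname(cachefile)
      # Stream the build tree through zstd; --fast favors checkpoint speed over ratio.
      system "tar c #{File.basename(build_dir)} | nice -n 20 zstd -T0 --ultra --fast -f -o #{cachefile} -",
             chdir: File.dirname(build_dir)
      # Write the sidecar so a later run can verify the tarball with `sha256sum -c`.
      system "sha256sum #{cachefile} > #{cachefile}.sha256" if File.file?(cachefile)
    end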
@@ -81,6 +81,7 @@ String.use_color = args['--color'] || !args['--no-color']
@opt_json = args['--json']
@opt_keep = args['--keep']
@opt_recursive = args['--recursive-build']
@opt_regen_filelist = args['--regenerate-filelist']
@opt_source = args['--source']
@opt_update = args['--update-package-files']
@opt_version = args['--version']
@@ -93,6 +94,7 @@ String.use_color = args['--color'] || !args['--no-color']

# Make sure crew work directories exist.
FileUtils.mkdir_p CREW_BREW_DIR
FileUtils.mkdir_p CREW_CACHE_DIR
FileUtils.mkdir_p CREW_DEST_DIR

class ExitMessage
@@ -185,7 +187,8 @@ end

def cache_build
build_cachefile = File.join(CREW_CACHE_DIR, "#{@pkg.name}-#{@pkg.version}-build-#{@device[:architecture]}.tar.zst")
if CREW_CACHE_ENABLED && File.writable?(CREW_CACHE_DIR)
puts "build_cachefile is #{build_cachefile}"
if (@pkg.cache_build? || CREW_CACHE_BUILD) && File.writable?(CREW_CACHE_DIR)
puts 'Caching build dir...'
pkg_build_dirname_absolute = File.join(CREW_BREW_DIR, @extract_dir)
pkg_build_dirname = File.basename(pkg_build_dirname_absolute)
@@ -196,21 +199,53 @@ def cache_build
FileUtils.mv build_cachefile, "#{build_cachefile}.bak", force: true if File.file?(build_cachefile)
FileUtils.mv "#{build_cachefile}.sha256", "#{build_cachefile}.sha256.bak", force: true if File.file?("#{build_cachefile}.sha256")
Dir.chdir(CREW_BREW_DIR) do
# if ENV['NESTED_CI']
## Directly upload if in a CI environment.
# abort "\nGITLAB_TOKEN environment variable not set.\n".lightred if GITLAB_TOKEN.nil?
# build_cache_url = "#{CREW_GITLAB_PKG_REPO}/generic/#{@pkg.name}/#{@pkg.version}_#{@device[:architecture]}_build/#{@pkg.name}-#{@pkg.version}-build-#{@device[:architecture]}.tar.zst"
# output = `tar c #{pkg_build_dirname} \
# | nice -n 20 zstd -T0 --stdout --ultra --fast -f - | \
# curl -# --header "#{CREW_GITLAB_TOKEN_LABEL}: #{GITLAB_TOKEN}" -F "file=@-;filename=#{@pkg.name}-#{@pkg.version}-build-#{@device[:architecture]}.tar.zst" #{build_cache_url} | cat`.chomp
# if output.include?('201 Created')
# puts "#{output}\n".lightgreen if output.include?('201 Created')
# else
# puts "#{output}\n".lightred
# puts "tar c #{pkg_build_dirname} \
# | nice -n 20 zstd -T0 --stdout --ultra --fast -f - | \
# curl -# --header \"#{CREW_GITLAB_TOKEN_LABEL}: GITLAB_TOKEN\" -F \"file=@-;filename=#{@pkg.name}-#{@pkg.version}-build-#{@device[:architecture]}.tar.zst\" \"#{build_cache_url}\" | cat"
# end
# else
puts `df -h`.chomp if ENV['NESTED_CI']
@build_cachefile_lockfile = CrewLockfile.new "#{build_cachefile}.lock"
begin
@build_cachefile_lockfile.lock
system "tar c#{@verbose} #{pkg_build_dirname} \
| nice -n 20 zstd -c --ultra --fast -f -o #{build_cachefile} -"
system "tar c #{pkg_build_dirname} \
| nice -n 20 zstd -T0 --ultra --fast -f -o #{build_cachefile} -"
ensure
@build_cachefile_lockfile.unlock
end
# end
end
end
system "sha256sum #{build_cachefile} > #{build_cachefile}.sha256"
system "sha256sum #{build_cachefile} > #{build_cachefile}.sha256" if File.file?(build_cachefile)
puts "Build directory cached at #{build_cachefile}".lightgreen
if @pkg.cache_build? # && !ENV['NESTED_CI']
abort "\nGITLAB_TOKEN environment variable not set.\n".lightred if GITLAB_TOKEN.nil?
puts "Uploading #{build_cachefile} ...".orange
build_cache_url = "#{CREW_GITLAB_PKG_REPO}/generic/#{@pkg.name}/#{@pkg.version}_#{@device[:architecture]}_build/#{@pkg.name}-#{@pkg.version}-build-#{@device[:architecture]}.tar.zst"
puts "curl -# --header \"#{CREW_GITLAB_TOKEN_LABEL}: GITLAB_TOKEN\" --upload-file \"#{build_cachefile}\" \"#{build_cache_url}\" | cat" if CREW_VERBOSE
output = `curl -# --header "#{CREW_GITLAB_TOKEN_LABEL}: #{GITLAB_TOKEN}" --upload-file "#{build_cachefile}" "#{build_cache_url}" | cat`.chomp
puts "\e[1A\e[KChecking upload...\r".orange
if output.include?('201 Created')
puts "#{output}\n".lightgreen if output.include?('201 Created')
else
puts "#{output}\n".lightred
puts "curl -# --header \"#{CREW_GITLAB_TOKEN_LABEL}: GITLAB_TOKEN\" --upload-file \"#{build_cachefile}\" \"#{build_cache_url}\""
end
end
else
puts 'CREW_CACHE_ENABLED is not set.'.orange unless CREW_CACHE_ENABLED
puts 'CREW_CACHE_DIR is not writable.'.lightred unless File.writable?(CREW_CACHE_DIR)
puts 'CREW_CACHE_BUILD is not set.'.orange unless CREW_CACHE_BUILD || @pkg.cache_build?
puts "#{CREW_CACHE_DIR} is not writable.".lightred unless File.writable?(CREW_CACHE_DIR)
end
end

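The upload path in cache_build above shells out to curl against the GitLab generic package registry and treats '201 Created' in the response as success. A condensed, stand-alone sketch of that round trip, assuming the same CREW_GITLAB_PKG_REPO endpoint and token-label convention as in the diff (the method name is illustrative, not part of bin/crew):

    # Illustrative sketch of the cached-build upload performed by cache_build.
    def upload_build_cache(cachefile, name, version, arch)
      token = ENV.fetch('GITLAB_TOKEN', nil)
      abort 'GITLAB_TOKEN environment variable not set.' if token.nil?
      token_label = token.start_with?('glpat') ? 'PRIVATE-TOKEN' : 'DEPLOY-TOKEN'
      url = "#{CREW_GITLAB_PKG_REPO}/generic/#{name}/#{version}_#{arch}_build/#{File.basename(cachefile)}"
      # PUT the tarball to the generic package registry; GitLab answers with 201 Created on success.
      output = `curl -# --header "#{token_label}: #{token}" --upload-file "#{cachefile}" "#{url}" | cat`.chomp
      output.include?('201 Created') ? puts("Uploaded #{File.basename(cachefile)}") : warn(output)
    end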
@@ -455,11 +490,11 @@ def download
uri = URI.parse url
filename = File.basename(uri.path)
# # If we're downloading a binary, reset the filename to what it would have been if we didn't download from the API.
filename = "#{@pkg.name}-#{@pkg.version}-chromeos-#{ARCH}.#{@pkg.binary_compression}" if filename.eql?('download')
filename = "#{@pkg.name}-#{@pkg.version}-chromeos-#{@device[:architecture]}.#{@pkg.binary_compression}" if filename.eql?('download')
@extract_dir = "#{@pkg.name}.#{Time.now.utc.strftime('%Y%m%d%H%M%S')}.dir"

build_cachefile = File.join(CREW_CACHE_DIR, "#{@pkg.name}-#{@pkg.version}-build-#{@device[:architecture]}.tar.zst")
return { source:, filename: } if CREW_CACHE_BUILD && File.file?(build_cachefile) && !@pkg.built
return { source:, filename: } if (CREW_CACHE_BUILD || @pkg.cache_build?) && File.file?(build_cachefile) && !@pkg.built

if !url
abort "No precompiled binary or source is available for #{@device[:architecture]}.".lightred
@@ -474,6 +509,37 @@ def download
end

git = true unless @pkg.git_hashtag.to_s.empty?
gitlab_binary_url = "#{CREW_GITLAB_PKG_REPO}/generic/#{@pkg.name}/#{@pkg.version}_#{@device[:architecture]}/#{@pkg.name}-#{@pkg.version}-chromeos-#{@device[:architecture]}.#{@pkg.binary_compression}"

if (@pkg.cache_build? || CREW_CACHE_BUILD) && !File.file?(build_cachefile) && !@pkg.no_source_build? && !File.file?(File.join(CREW_CACHE_DIR, "#{@pkg.name}-#{@pkg.version}-chromeos-#{@device[:architecture]}.#{@pkg.binary_compression}")) && `curl -fsI #{gitlab_binary_url}`.lines.first.split[1] != '200'

build_cache_url = "#{CREW_GITLAB_PKG_REPO}/generic/#{@pkg.name}/#{@pkg.version}_#{@device[:architecture]}_build/#{@pkg.name}-#{@pkg.version}-build-#{@device[:architecture]}.tar.zst"
puts 'Checking for cached build...'.orange
# Does a remote build artifact exist?
puts build_cache_url if CREW_VERBOSE
puts "curl -fsI #{build_cache_url}" if CREW_VERBOSE
if `curl -fsI #{build_cache_url}`.lines.first.split[1] == '200'
puts "curl -# --header \"#{CREW_GITLAB_TOKEN_LABEL}: GITLAB_TOKEN\" --upload-file \"#{build_cachefile}\" \"#{build_cache_url}\" | cat" if CREW_VERBOSE
# What is the gitlab package artifact binary PACKAGE_ID?
gitlab_pkg_id = `curl --location --header "#{CREW_GITLAB_TOKEN_LABEL}: #{GITLAB_TOKEN}" \
"#{CREW_GITLAB_PKG_REPO}?package_type=generic&package_name=#{@pkg.name}&package_version=#{@pkg.version}_#{@device[:architecture]}_build" \
| jq -r ".[] | select(.name==\\"#{@pkg.name}\\" and .version==\\"#{@pkg.version}_#{@device[:architecture]}_build\\") | .id"`.chomp
# What is the hash of the gitlab package artifact binary?
gitlab_build_artifact_sha256 = `curl --location --header "#{CREW_GITLAB_TOKEN_LABEL}: #{GITLAB_TOKEN}" \
"#{CREW_GITLAB_PKG_REPO}/#{gitlab_pkg_id}/package_files" \
| jq -r "last(.[].file_sha256)"`.chomp
gitlab_build_artifact_date = `curl --location --header "#{CREW_GITLAB_TOKEN_LABEL}: #{GITLAB_TOKEN}" \
"#{CREW_GITLAB_PKG_REPO}/#{gitlab_pkg_id}/package_files" \
| jq -r "last(.[].created_at)"`.chomp
puts "Cached build artifact from #{gitlab_build_artifact_date} exists!".lightgreen
puts "Downloading most recent cached build artifact for #{@pkg.name}-#{@pkg.version}...".orange
# Download the package build artifact.
downloader(build_cache_url, gitlab_build_artifact_sha256, build_cachefile, no_update_hash: true)
File.write "#{build_cachefile}.sha256", <<~BUILD_CACHEFILE_SHA256_EOF
#{gitlab_build_artifact_sha256} #{build_cachefile}
BUILD_CACHEFILE_SHA256_EOF
end
end

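The lookup above drives curl and jq: one GitLab Packages API call lists generic packages filtered by name and version to find the package id, and a second lists that package's files to read the newest artifact's sha256 and creation date. A hedged, stand-alone sketch of the same two calls using net/http and JSON rather than the curl/jq pipeline bin/crew actually runs (method names are illustrative):

    require 'json'
    require 'net/http'
    require 'uri'

    # Sketch: perform an authenticated GET against the GitLab Packages API and parse the JSON body.
    def gitlab_api_json(uri, token_label, token)
      req = Net::HTTP::Get.new(uri)
      req[token_label] = token unless token.nil?
      res = Net::HTTP.start(uri.host, uri.port, use_ssl: uri.scheme == 'https') { |http| http.request(req) }
      JSON.parse(res.body)
    end

    # Sketch: resolve the newest cached-build artifact's sha256 for a package/version/arch triple.
    def gitlab_build_artifact_sha256(repo_api, token_label, token, name, version, arch)
      list_uri = URI("#{repo_api}?package_type=generic&package_name=#{name}&package_version=#{version}_#{arch}_build")
      pkg = gitlab_api_json(list_uri, token_label, token)
            .find { |p| p['name'] == name && p['version'] == "#{version}_#{arch}_build" }
      return nil if pkg.nil?

      files = gitlab_api_json(URI("#{repo_api}/#{pkg['id']}/package_files"), token_label, token)
      files.last&.fetch('file_sha256', nil) # same "last entry wins" choice as the jq query
    end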
Dir.chdir CREW_BREW_DIR do
FileUtils.mkdir_p @extract_dir
@@ -488,17 +554,24 @@ def download
# This also covers all precompiled binaries.
when /\.zip$/i, /\.(tar(\.(gz|bz2|xz|lzma|lz|zst))?|tgz|tbz|tpxz|txz)$/i, /\.deb$/i, /\.AppImage$/i, /\.gem$/i
# Recall file from cache if requested
if CREW_CACHE_ENABLED
if CREW_CACHE_ENABLED || CREW_CACHE_BUILD || @pkg.cache_build?
puts "Looking for #{@pkg.name} archive in cache".orange if CREW_VERBOSE
# Privilege CREW_LOCAL_BUILD_DIR over CREW_CACHE_DIR.
local_build_cachefile = File.join(CREW_LOCAL_BUILD_DIR, filename)
crew_cache_dir_cachefile = File.join(CREW_CACHE_DIR, filename)
cachefile = File.file?(local_build_cachefile) ? local_build_cachefile : crew_cache_dir_cachefile
puts "Using #{@pkg.name} archive from the build cache at #{cachefile}; The checksum will not be checked against the package file.".orange if cachefile.include?(CREW_LOCAL_BUILD_DIR)
# puts "Using #{@pkg.name} archive from the build cache at #{cachefile}; The checksum will not be checked against the package file.".orange if cachefile.include?(CREW_LOCAL_BUILD_DIR)
puts "Using #{@pkg.name} archive from the build cache at #{cachefile}".orange
if File.file?(cachefile)
puts "#{@pkg.name.capitalize} archive file exists in cache".lightgreen if CREW_VERBOSE
# Don't check checksum if file is in the build cache.
if Digest::SHA256.hexdigest(File.read(cachefile)) == sha256sum || sha256sum =~ /^SKIP$/i || cachefile.include?(CREW_LOCAL_BUILD_DIR)
# Don't validate checksum if file is in the local build cache.
if cachefile.include?(CREW_LOCAL_BUILD_DIR) || cachefile.include?(CREW_CACHE_DIR)
sha256sum = 'SKIP'
else
sha256sum_out, _stderr, _status = Open3.capture3("sha256sum #{cachefile}")
calc_sha256sum = sha256sum_out.split[0]
end
if sha256sum =~ /^SKIP$/i || calc_sha256sum == sha256sum
begin
# Hard link cached file if possible.
FileUtils.ln cachefile, CREW_BREW_DIR, force: true, verbose: CREW_VERBOSE unless File.identical?(cachefile, "#{CREW_BREW_DIR}/#{filename}")
@@ -627,10 +700,9 @@ def download
@git_cachefile_lockfile = CrewLockfile.new "#{cachefile}.lock"
begin
@git_cachefile_lockfile.lock
system "tar c#{@verbose} \
system "tar c \
$(#{CREW_PREFIX}/bin/find -mindepth 1 -maxdepth 1 -printf '%P\n') | \
nice -n 20 zstd -c -T0 --ultra -20 - > \
#{cachefile}"
nice -n 20 zstd -T0 --ultra -20 -o #{cachefile} -"
ensure
@git_cachefile_lockfile.unlock
end
@@ -649,7 +721,7 @@ def unpack(meta)
FileUtils.mkdir_p @extract_dir, verbose: CREW_VERBOSE

build_cachefile = File.join(CREW_CACHE_DIR, "#{@pkg.name}-#{@pkg.version}-build-#{@device[:architecture]}.tar.zst")
if !@pkg.built && CREW_CACHE_BUILD && File.file?(build_cachefile) && File.file?("#{build_cachefile}.sha256") && (system "sha256sum -c #{build_cachefile}.sha256", chdir: CREW_CACHE_DIR)
if !@pkg.built && (@pkg.cache_build? || CREW_CACHE_BUILD) && File.file?(build_cachefile) && File.file?("#{build_cachefile}.sha256") && (system "sha256sum -c #{build_cachefile}.sha256", chdir: CREW_CACHE_DIR)
@pkg.cached_build = true
puts "Extracting cached build directory from #{build_cachefile}".lightgreen
system "tar -Izstd -x#{@verbose}f #{build_cachefile} -C #{CREW_BREW_DIR}", exception: true
@@ -733,7 +805,20 @@ def build_and_preconfigure(target_dir)
build_start_time = Time.now.to_i

@pkg.in_build = true
unless @pkg.cached_build
Signal.trap('INT') do
if CREW_CACHE_BUILD || @pkg.cache_build?
cache_build
ExitMessage.add 'The build was interrupted. The build directory was cached.'.lightred
exit 1
end
ExitMessage.add 'Interrupted!'.lightred
exit 1
end

@pkg.prebuild_config_and_report
if @pkg.cache_build?
@pkg.pre_cached_build
else
@pkg.patch
@pkg.prebuild
end
@@ -741,17 +826,20 @@ def build_and_preconfigure(target_dir)
begin
@pkg.build
rescue StandardError => e
if CREW_CACHE_FAILED_BUILD
if CREW_CACHE_FAILED_BUILD || @pkg.cache_build?
cache_build
abort "There was a build error, caching build directory.\n#{e}".lightred
end
abort "There was a build error.\n#{e}".lightred
end
@pkg.in_build = false
Signal.trap('INT', 'DEFAULT') if CREW_CACHE_BUILD || @pkg.cache_build?

cache_build if CREW_CACHE_BUILD || @pkg.cache_build?

# wipe crew destdir
FileUtils.rm_rf Dir["#{CREW_DEST_DIR}/*"], verbose: CREW_VERBOSE unless @pkg.superclass.to_s == 'RUBY'
puts 'Preconfiguring package...'
cache_build if CREW_CACHE_BUILD
@pkg.install unless @pkg.superclass.to_s == 'RUBY'

build_end_time = Time.now.to_i
@@ -856,7 +944,7 @@ def prepare_package(destdir)
strip_dir destdir

# Create file list and calculate file size
filelist = Dir[".{#{CREW_PREFIX}/**/{*,.?*/**},#{HOME}}/**/{*,.?*/**}"].select do |e|
filelist = Dir[".{#{CREW_PREFIX}/**/{.*,*,.?*/**},#{HOME}}/**/{.*,*,.?*/**}"].select do |e|
File.file?(e) || File.symlink?(e)
end.to_h do |e|
# Ignore symlinks to prevent duplicating calculation.
@@ -1026,18 +1114,20 @@ end

def install_files(src, dst = File.join(CREW_PREFIX, src.delete_prefix('./usr/local')))
if Dir.exist?(src)
# Use tar if @opt_regen_filelist is set as that preserves dot files
# after an install.
crew_mvdir_error = false
if File.executable?("#{CREW_PREFIX}/bin/crew-mvdir") && !CREW_DISABLE_MVDIR && !system("crew-mvdir #{@short_verbose} #{src} #{dst}")
if !@opt_regen_filelist && File.executable?("#{CREW_PREFIX}/bin/crew-mvdir") && !CREW_DISABLE_MVDIR && !system("crew-mvdir #{@short_verbose} #{src} #{dst}")
puts "src is #{src}".lightred
puts "dst is #{dst}".lightred
system "ls -aFl #{dst}"
crew_mvdir_error = true
end
# Handle case of crew-mvdir having failed.
if !File.executable?("#{CREW_PREFIX}/bin/crew-mvdir") || CREW_DISABLE_MVDIR || crew_mvdir_error
warn 'crew-mvdir is not installed. Please install it with \'crew install crew_mvdir\' for improved installation performance'.yellow unless (@pkg.name == 'crew_mvdir') || CREW_DISABLE_MVDIR || crew_mvdir_error
if !File.executable?("#{CREW_PREFIX}/bin/crew-mvdir") || CREW_DISABLE_MVDIR || crew_mvdir_error || @opt_regen_filelist
warn 'crew-mvdir is not installed. Please install it with \'crew install crew_mvdir\' for improved installation performance'.yellow unless (@pkg.name == 'crew_mvdir') || CREW_DISABLE_MVDIR || crew_mvdir_error || @opt_regen_filelist
warn 'crew-mvdir had an error. Using rsync.'.yellow if crew_mvdir_error
if File.executable?("#{CREW_PREFIX}/bin/rsync") && system("#{CREW_PREFIX}/bin/rsync --version > /dev/null")
if File.executable?("#{CREW_PREFIX}/bin/rsync") && system("#{CREW_PREFIX}/bin/rsync --version > /dev/null") && !@opt_regen_filelist
# rsync src path needs a trailing slash
src << '/' unless src.end_with?('/')
# Check for ACLs support.
@@ -1063,8 +1153,32 @@ def install_package(pkgdir)
# install filelist, dlist and binary files
puts "Performing install for #{@pkg.name}...".lightblue

FileUtils.mv 'dlist', File.join(CREW_META_PATH, "#{@pkg.name}.directorylist"), verbose: CREW_VERBOSE
FileUtils.mv 'filelist', File.join(CREW_META_PATH, "#{@pkg.name}.filelist"), verbose: CREW_VERBOSE
# Sometimes we want to regenerate a file list for an existing
# package without forcing a rebuild.

if @opt_regen_filelist
puts "Regenerating filelist for #{@pkg.name}...".orange
# Create file list and calculate file size
filelist = Dir[".{#{CREW_PREFIX}/**/{.*,*,.?*/**},#{HOME}}/**/{.*,*,.?*/**}"].select do |e|
File.file?(e) || File.symlink?(e)
end.to_h do |e|
# Ignore symlinks to prevent duplicating calculation.
[e[1..], File.symlink?(e) ? 0 : File.size(e)]
end

File.write 'filelist', <<~EOF
# Total size: #{filelist.values.sum}
#{filelist.keys.sort.join("\n")}
EOF

if Dir.exist?("#{CREW_LOCAL_REPO_ROOT}/manifest") && File.writable?("#{CREW_LOCAL_REPO_ROOT}/manifest")
puts "Updating manifest filelist for #{@pkg.name}...".orange
FileUtils.mkdir_p "#{CREW_LOCAL_REPO_ROOT}/manifest/#{ARCH}/#{@pkg.name.chr.downcase}"
FileUtils.cp 'filelist', "#{CREW_LOCAL_REPO_ROOT}/manifest/#{ARCH}/#{@pkg.name.chr.downcase}/#{@pkg.name}.filelist"
end
end
FileUtils.cp 'dlist', File.join(CREW_META_PATH, "#{@pkg.name}.directorylist"), verbose: CREW_VERBOSE
FileUtils.cp 'filelist', File.join(CREW_META_PATH, "#{@pkg.name}.filelist"), verbose: CREW_VERBOSE

unless CREW_NOT_LINKS || @pkg.no_links?
Find.find(Dir.pwd) do |path|
@@ -1483,7 +1597,7 @@ def archive_package(crew_archive_dest)
@pkg_name_lockfile = CrewLockfile.new "#{crew_archive_dest}/#{pkg_name}.lock"
begin
@pkg_name_lockfile.lock
system "tar c#{@verbose} * | nice -n 20 zstd -c -T0 --ultra -20 - > #{crew_archive_dest}/#{pkg_name}"
system "tar c * | nice -n 20 zstd -T0 --ultra -20 -o #{crew_archive_dest}/#{pkg_name} -"
ensure
@pkg_name_lockfile.unlock
end
@@ -1558,7 +1672,7 @@ def update_package_file(package = nil, pkg_version = nil, binary_compression = n
end
end

def upload(pkg_name = nil, pkg_version = nil, gitlab_token = nil, gitlab_token_username = nil, binary_compression = nil)
def upload(pkg_name = nil, pkg_version = nil, binary_compression = nil)
# Architecture independent:
# 1. Abort early if package manifests exist but are empty, as this
# likely indicates a failed build.
@@ -1599,8 +1713,8 @@ def upload(pkg_name = nil, pkg_version = nil, gitlab_token = nil, gitlab_token_u
# by build workflows to make sure updated manifests get
# uploaded.)
abort "\nPackage to be uploaded was not specified.\n".lightred if pkg_name.nil?
abort "\nGITLAB_TOKEN environment variable not set.\n".lightred if gitlab_token.nil?
abort "\nGITLAB_TOKEN_USERNAME environment variable not set.\n".lightred if gitlab_token_username.nil?
abort "\nGITLAB_TOKEN environment variable not set.\n".lightred if GITLAB_TOKEN.nil?
abort "\nGITLAB_TOKEN_USERNAME environment variable not set.\n".lightred if GITLAB_TOKEN_USERNAME.nil?

packages = pkg_name
packages.strip!
@@ -1741,10 +1855,9 @@ def upload(pkg_name = nil, pkg_version = nil, gitlab_token = nil, gitlab_token_u
next unless upload_binary
# 4. Upload.
puts "Uploading #{local_tarfile} ...".orange if CREW_VERBOSE
token_label = gitlab_token.split('-').first == 'glpat' ? 'PRIVATE-TOKEN' : 'DEPLOY-TOKEN'
puts "curl -# --header \"#{token_label}: #{gitlab_token}\" --upload-file \"#{local_tarfile}\" \"#{new_url}\" | cat" if CREW_VERBOSE
puts "curl -# --header \"#{CREW_GITLAB_TOKEN_LABEL}: GITLAB_TOKEN\" --upload-file \"#{local_tarfile}\" \"#{new_url}\" | cat" if CREW_VERBOSE
puts "\e[1A\e[KUploading...\r".orange
output = `curl -# --header "#{token_label}: #{gitlab_token}" --upload-file "#{local_tarfile}" "#{new_url}" | cat`.chomp
output = `curl -# --header "#{CREW_GITLAB_TOKEN_LABEL}: #{GITLAB_TOKEN}" --upload-file "#{local_tarfile}" "#{new_url}" | cat`.chomp
puts "\e[1A\e[KChecking upload...\r".orange
if output.include?('201 Created')
puts "curl -Ls #{new_url} | sha256sum" if CREW_VERBOSE
@@ -1784,7 +1897,7 @@ def upload(pkg_name = nil, pkg_version = nil, gitlab_token = nil, gitlab_token_u
puts "Uploading #{wheel}.\nNote that a '400 Bad Request' error here means the wheel has already been uploaded.".orange
# Note that this uses the python twine from https://github.com/pypa/twine/pull/1123
abort 'Twine is broken, cannot upload python wheels.'.lightred unless system('twine --help', %i[out err] => File::NULL)
system("twine upload -u #{gitlab_token_username} -p #{gitlab_token} --repository-url #{CREW_GITLAB_PKG_REPO}/pypi --non-interactive #{wheel}", %i[err] => File::NULL)
system("twine upload -u #{GITLAB_TOKEN_USERNAME} -p #{GITLAB_TOKEN} --repository-url #{CREW_GITLAB_PKG_REPO}/pypi --non-interactive #{wheel}", %i[err] => File::NULL)
FileUtils.rm_f wheel
end
end
@@ -1913,6 +2026,8 @@ def download_command(args)
else
download
end
# Clean up remnant @extract_dir folders.
FileUtils.rm_rf File.join(CREW_BREW_DIR, @extract_dir)
end
end

@@ -2029,13 +2144,11 @@ def update_package_file_command(args)
end

def upload_command(args)
gitlab_token = ENV.fetch('GITLAB_TOKEN', nil)
gitlab_token_username = ENV.fetch('GITLAB_TOKEN_USERNAME', nil)
args = { '<name>' => args.split } if args.is_a? String
upload if args['<name>'].empty?
args['<name>'].each do |name|
search name
upload(name, @pkg.version, gitlab_token, gitlab_token_username, @pkg.binary_compression)
upload(name, @pkg.version, @pkg.binary_compression)
end
end

@@ -2067,16 +2180,6 @@ end

def command?(name) = !!!name[/^[-<]/]

Signal.trap('INT') do
if CREW_CACHE_FAILED_BUILD && CREW_CACHE_ENABLED && @pkg.in_build
cache_build
ExitMessage.add 'The build was interrupted. The build directory was cached.'.lightred
exit 1
end
ExitMessage.add 'Interrupted!'.lightred
exit 1
end

@device = ConvenienceFunctions.load_symbolized_json

@last_update_check = Dir["#{CREW_LIB_PATH}/{.git/FETCH_HEAD,lib/const.rb}"].compact.map { |i| File.mtime(i).utc.to_i }.max

@@ -5,6 +5,8 @@ class Command
case property
when 'arch_flags_override'
puts "Use the 'arch_flags_override' property to override architecture specific flags."
when 'cache_build'
puts "The 'cache_build' property caches build tree artifacts to checkpoint builds."
when 'conflicts_ok'
puts "The 'conflicts_ok' property bypasses checks for other package file conflicts."
when 'git_clone_deep'

@@ -3,17 +3,25 @@ require_relative '../require_gem'
require_relative '../report_buildsystem_methods'

class CMake < Package
property :cmake_build_extras, :cmake_build_relative_dir, :cmake_install_extras, :cmake_options, :pre_cmake_options
property :cmake_build_extras, :cmake_build_relative_dir, :cmake_install_extras, :cmake_options, :cmake_pre_cache_build_extras, :pre_cmake_options

def self.build
def self.prebuild_config_and_report
@cmake_build_relative_dir ||= '.'
@crew_cmake_options = @no_lto ? CREW_CMAKE_OPTIONS.gsub('-ffat-lto-objects', '').gsub('-flto=auto', '').sub('-DCMAKE_INTERPROCEDURAL_OPTIMIZATION=TRUE', '') : CREW_CMAKE_OPTIONS

extend ReportBuildsystemMethods

print_buildsystem_methods
end

system "#{@pre_cmake_options} cmake -S #{@cmake_build_relative_dir} -B #{@cmake_build_relative_dir}/builddir -G Ninja #{@crew_cmake_options} #{@cmake_options}"
def self.pre_cached_build
Dir.chdir(@cmake_build_relative_dir) do
@cmake_pre_cache_build_extras&.call
end
end

def self.build
system "#{@pre_cmake_options} cmake -S #{@cmake_build_relative_dir} -B #{@cmake_build_relative_dir}/builddir -G Ninja #{@crew_cmake_options} #{@cmake_options}" unless File.file?("#{@cmake_build_relative_dir}/build.ninja")
system "#{CREW_PREFIX}/bin/jobserver_pool.py -j #{CREW_NPROC} #{CREW_NINJA} -C #{@cmake_build_relative_dir}/builddir"
@cmake_build_extras&.call
end
@@ -3,9 +3,9 @@ require_relative '../require_gem'
require_relative '../report_buildsystem_methods'

class Meson < Package
property :meson_build_relative_dir, :meson_options, :meson_build_extras, :meson_install_extras, :pre_meson_options
property :meson_build_relative_dir, :meson_options, :meson_build_extras, :meson_install_extras, :meson_pre_cache_build_extras, :pre_meson_options

def self.build
def self.prebuild_config_and_report
@meson_build_relative_dir ||= '.'
@crew_meson_options = @no_mold ? CREW_MESON_OPTIONS.gsub('-fuse-ld=mold', '') : CREW_MESON_OPTIONS
@crew_meson_options.gsub!('-Db_lto=true', '-Db_lto=false').gsub!('-flto=auto', '') if @no_lto
@@ -13,8 +13,16 @@ class Meson < Package
extend ReportBuildsystemMethods

print_buildsystem_methods
end

system "#{@pre_meson_options} meson setup #{@crew_meson_options} #{@meson_options} #{@meson_build_relative_dir}/builddir #{@meson_build_relative_dir}"
def self.pre_cached_build
Dir.chdir @meson_build_relative_dir do
@meson_pre_cache_build_extras&.call
end
end

def self.build
system "#{@pre_meson_options} meson setup #{@crew_meson_options} #{@meson_options} #{@meson_build_relative_dir}/builddir #{@meson_build_relative_dir}" unless File.file?("#{@meson_build_relative_dir}/build.ninja")
system "meson configure --no-pager #{@meson_build_relative_dir}/builddir"
system "#{CREW_PREFIX}/bin/jobserver_pool.py -j #{CREW_NPROC} #{CREW_NINJA} -C #{@meson_build_relative_dir}/builddir"
@meson_build_extras&.call
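With the new cache_build boolean property and the cmake_pre_cache_build_extras / meson_pre_cache_build_extras hooks, a long-running CMake- or Meson-based package can opt into checkpointed builds from its package file. A hypothetical example (the package name, version, URL, and hook body below are placeholders, not an existing Chromebrew package):

    require 'buildsystems/meson'

    # Hypothetical package opting into cached (checkpointed) builds.
    class Some_big_package < Meson
      description 'Example of a package using the cache_build property'
      homepage 'https://example.org'
      version '1.0.0'
      license 'MIT'
      compatibility 'aarch64 armv7l x86_64'
      source_url 'https://example.org/some_big_package-1.0.0.tar.xz'
      source_sha256 'SKIP'
      binary_compression 'tar.zst'

      cache_build # Cache and upload the build tree so an interrupted CI build can resume.

      meson_pre_cache_build_extras do
        # Runs before starting or restarting a cached build (instead of patch/prebuild).
        puts 'Re-priming environment for a resumed build...'
      end
    end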
lib/const.rb (25 lines changed)
@@ -89,16 +89,27 @@ end

# Use sane minimal defaults if in container and no override specified.
CREW_KERNEL_VERSION ||=
if CREW_IN_CONTAINER && ENV.fetch('CREW_KERNEL_VERSION', nil).nil?
if (CREW_IN_CONTAINER && ENV.fetch('CREW_KERNEL_VERSION', nil).nil?) || ENV['CI']
ARCH.eql?('i686') ? '3.8' : '6.12'
else
ENV.fetch('CREW_KERNEL_VERSION', Etc.uname[:release].rpartition('.').first)
end

# Local constants for contributors.
CREW_CACHE_DIR ||= ENV.fetch('CREW_CACHE_DIR', "#{HOME}/.cache/crewcache") unless defined?(CREW_CACHE_DIR)
CREW_CACHE_FAILED_BUILD ||= ENV.fetch('CREW_CACHE_FAILED_BUILD', false) unless defined?(CREW_CACHE_FAILED_BUILD)
CREW_CACHE_BUILD ||= ENV.fetch('CREW_CACHE_BUILD', false) unless defined?(CREW_CACHE_BUILD)
CREW_LOCAL_REPO_ROOT ||= `git rev-parse --show-toplevel 2>/dev/null`.chomp
CREW_LOCAL_BUILD_DIR ||= "#{CREW_LOCAL_REPO_ROOT}/release/#{ARCH}"
CREW_MAX_BUILD_TIME ||= ENV.fetch('CREW_MAX_BUILD_TIME', '19800') unless defined?(CREW_MAX_BUILD_TIME) # GitHub Action containers are killed after 6 hours, so set to 5.5 hours.
CREW_GITLAB_PKG_REPO ||= 'https://gitlab.com/api/v4/projects/26210301/packages'
GITLAB_TOKEN ||= ENV.fetch('GITLAB_TOKEN', nil) unless defined?(GITLAB_TOKEN)
GITLAB_TOKEN_USERNAME ||= ENV.fetch('GITLAB_TOKEN_USERNAME', nil) unless defined?(GITLAB_TOKEN_USERNAME)
CREW_GITLAB_TOKEN_LABEL ||= if GITLAB_TOKEN.nil?
''
else
(GITLAB_TOKEN.split('-').first == 'glpat' ? 'PRIVATE-TOKEN' : 'DEPLOY-TOKEN')
end

CREW_LIB_PREFIX ||= File.join(CREW_PREFIX, ARCH_LIB)
CREW_MAN_PREFIX ||= File.join(CREW_PREFIX, 'share/man')
@@ -123,9 +134,6 @@ CREW_DEST_MUSL_PREFIX ||= File.join(CREW_DEST_DIR, CREW_MUSL_PREFIX)
MUSL_LIBC_VERSION ||= File.executable?("#{CREW_MUSL_PREFIX}/lib/libc.so") ? `#{CREW_MUSL_PREFIX}/lib/libc.so 2>&1`[/\bVersion\s+\K\S+/] : nil unless defined?(MUSL_LIBC_VERSION)

CREW_DEST_HOME ||= File.join(CREW_DEST_DIR, HOME)
CREW_CACHE_DIR ||= ENV.fetch('CREW_CACHE_DIR', "#{HOME}/.cache/crewcache") unless defined?(CREW_CACHE_DIR)
CREW_CACHE_BUILD ||= ENV.fetch('CREW_CACHE_BUILD', false) unless defined?(CREW_CACHE_BUILD)
CREW_CACHE_FAILED_BUILD ||= ENV.fetch('CREW_CACHE_FAILED_BUILD', false) unless defined?(CREW_CACHE_FAILED_BUILD)
CREW_NO_GIT ||= ENV.fetch('CREW_NO_GIT', false) unless defined?(CREW_NO_GIT)
CREW_UNATTENDED ||= ENV.fetch('CREW_UNATTENDED', false) unless defined?(CREW_UNATTENDED)

@@ -146,7 +154,7 @@ CREW_NPROC ||=
# Set following as boolean if environment variables exist.
# Timeout for agree questions in package.rb:
CREW_AGREE_TIMEOUT_SECONDS ||= ENV.fetch('CREW_AGREE_TIMEOUT_SECONDS', 10).to_i unless defined?(CREW_AGREE_TIMEOUT_SECONDS)
CREW_CACHE_ENABLED ||= ENV.fetch('CREW_CACHE_ENABLED', false) unless defined?(CREW_CACHE_ENABLED)
CREW_CACHE_ENABLED ||= ENV.fetch('CREW_CACHE_ENABLED', CREW_CACHE_FAILED_BUILD) unless defined?(CREW_CACHE_ENABLED)
CREW_CONFLICTS_ONLY_ADVISORY ||= ENV.fetch('CREW_CONFLICTS_ONLY_ADVISORY', false) unless defined?(CREW_CONFLICTS_ONLY_ADVISORY)
# or use conflicts_ok
CREW_DISABLE_ENV_OPTIONS ||= ENV.fetch('CREW_DISABLE_ENV_OPTIONS', false) unless defined?(CREW_DISABLE_ENV_OPTIONS)
@@ -455,17 +463,17 @@ CREW_DOCOPT ||= <<~DOCOPT
crew download [options] [-s|--source] [-v|--verbose] <name> ...
crew files [options] <name> ...
crew help [options] [<command>] [-v|--verbose] [<subcommand>]
crew install [options] [-f|--force] [-k|--keep] [-s|--source] [-S|--recursive-build] [-v|--verbose] <name> ...
crew install [options] [-f|--force] [-k|--keep] [--regenerate-filelist] [-s|--source] [-S|--recursive-build] [-v|--verbose] <name> ...
crew list [options] [-v|--verbose] (available|compatible|incompatible|essential|installed)
crew postinstall [options] [-v|--verbose] <name> ...
crew prop [options] [<property>]
crew reinstall [options] [-f|--force] [-k|--keep] [-s|--source] [-S|--recursive-build] [-v|--verbose] <name> ...
crew reinstall [options] [-f|--force] [-k|--keep] [-s|--source] [--regenerate-filelist] [-S|--recursive-build] [-v|--verbose] <name> ...
crew remove [options] [-f|--force] [-v|--verbose] <name> ...
crew search [options] [-v|--verbose] <name> ...
crew sysinfo [options] [-v|--verbose]
crew update [options] [-v|--verbose]
crew update_package_file [options] [-v|--verbose] [<name> ...]
crew upgrade [options] [-f|--force] [-k|--keep] [-s|--source] [-v|--verbose] [<name> ...]
crew upgrade [options] [-f|--force] [-k|--keep] [--regenerate-filelist] [-s|--source] [-v|--verbose] [<name> ...]
crew upload [options] [-f|--force] [-v|--verbose] [<name> ...]
crew upstream [options] [-j|--json|-u|--update-package-files|-v|--verbose] <name> ...
crew version [options] [<name>]
@@ -482,6 +490,7 @@ CREW_DOCOPT ||= <<~DOCOPT
-L --license Display the crew license.
-s --source Build or download from source even if pre-compiled binary exists.
-S --recursive-build Build from source, including all dependencies, even if pre-compiled binaries exist.
--regenerate-filelist Force regeneration of package filelists on install.
-t --tree Print dependencies in a tree-structure format.
-u --update-package-files Attempt to update the package version.
-v --verbose Show extra information.

@@ -1,5 +1,5 @@
require 'digest/sha2'
require 'io/console'
require 'open3'
require 'uri'
require_relative 'const'
require_relative 'color'
@@ -25,7 +25,7 @@ rescue RuntimeError => e
end
end

def downloader(url, sha256sum, filename = File.basename(url), verbose: false)
def downloader(url, sha256sum, filename = File.basename(url), no_update_hash: false, verbose: false)
# downloader: wrapper for all Chromebrew downloaders (`net/http`,`curl`...)
# Usage: downloader <url>, <sha256sum>, <filename::optional>, <verbose::optional>
#
@@ -37,6 +37,9 @@ def downloader(url, sha256sum, filename = File.basename(url), verbose: false)
puts "downloader(#{url}, #{sha256sum}, #{filename}, #{verbose})" if verbose
uri = URI(url)

# Make sure the destination dir for the filename exists.
FileUtils.mkdir_p File.dirname(filename)

if CREW_USE_CURL || !ENV['CREW_DOWNLOADER'].to_s.empty?
# force using external downloader if either CREW_USE_CURL or ENV['CREW_DOWNLOADER'] is set
puts "external_downloader(#{uri}, #{filename}, #{verbose})" if verbose
@@ -61,11 +64,15 @@ def downloader(url, sha256sum, filename = File.basename(url), verbose: false)
end
end

# verify with given checksum
calc_sha256sum = Digest::SHA256.hexdigest(File.read(filename))
# Verify with given checksum, using the external sha256sum binary so
# we do not load the entire file into ruby's process, which throws
# errors with large files on 32-bit architectures.
puts "Calculating checksum for #{filename}...".lightblue
sha256sum_out, _stderr, _status = Open3.capture3("sha256sum #{filename}")
calc_sha256sum = sha256sum_out.split[0]

unless (sha256sum =~ /^SKIP$/i) || (calc_sha256sum == sha256sum)
if CREW_FORCE
if CREW_FORCE && !no_update_hash
pkg_name = @pkg_name.blank? ? name : @pkg_name
puts "Updating checksum for #{filename}".lightblue
puts "from #{sha256sum} to #{calc_sha256sum}".lightblue

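The downloader now shells out to sha256sum instead of computing Digest::SHA256.hexdigest(File.read(...)), which avoids pulling a multi-gigabyte artifact into the Ruby process on 32-bit architectures. A minimal stand-alone sketch of that check (the method name is illustrative, not the downloader's own API):

    require 'open3'

    # Sketch: verify a downloaded file against an expected sha256 without reading it into memory.
    def checksum_ok?(filename, expected_sha256)
      return true if expected_sha256 =~ /^SKIP$/i

      stdout, _stderr, status = Open3.capture3('sha256sum', filename)
      status.success? && stdout.split.first == expected_sha256
    end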
@@ -26,35 +26,77 @@ def agree_with_default(yes_or_no_question_msg, character = nil, default:)
end

class Package
boolean_property :arch_flags_override, :conflicts_ok, :git_clone_deep,
:git_fetchtags, :gem_compile_needed, :gnome,
:ignore_updater, :is_fake, :is_musl, :is_static,
:no_binaries_needed, :no_compile_needed,
:no_compress, :no_env_options, :no_fhs, :no_filefix,
:no_git_submodules, :no_links, :no_lto, :no_mold,
:no_patchelf, :no_shrink, :no_source_build,
:no_strip, :no_upstream_update, :no_zstd, :patchelf,
:prerelease, :print_source_bashrc, :run_tests
boolean_property :arch_flags_override,
:cache_build, # Cache build to gitlab.
:conflicts_ok,
:git_clone_deep,
:git_fetchtags,
:gem_compile_needed,
:gnome,
:ignore_updater,
:is_fake,
:is_musl,
:is_static,
:no_binaries_needed,
:no_compile_needed,
:no_compress,
:no_env_options,
:no_fhs,
:no_filefix,
:no_git_submodules,
:no_links,
:no_lto,
:no_mold,
:no_patchelf,
:no_shrink,
:no_source_build,
:no_strip,
:no_upstream_update,
:no_zstd,
:patchelf,
:prerelease,
:print_source_bashrc,
:run_tests

property :description, :homepage, :version, :license, :compatibility,
:binary_compression, :binary_url, :binary_sha256, :source_url, :source_sha256,
:git_branch, :git_hashtag, :max_glibc, :min_glibc
property :description,
:homepage,
:version,
:license,
:compatibility,
:binary_compression,
:binary_url,
:binary_sha256,
:source_url,
:source_sha256,
:git_branch,
:git_hashtag,
:max_glibc,
:min_glibc

create_placeholder :preflight, # Function for checks to see if install should occur.
:patch, # Function to perform patch operations prior to build from source.
:prebuild, # Function to perform pre-build operations prior to build from source.
:build, # Function to perform build from source.
:postbuild, # Function to perform post-build for both source build and binary distribution.
:check, # Function to perform check from source build. (executes only during `crew build`)
:preinstall, # Function to perform pre-install operations prior to install.
:install, # Function to perform install from source build.
:postinstall, # Function to perform post-install for both source build and binary distribution.
:preremove, # Function to perform prior to package removal.
:remove, # Function to remove package.
:postremove # Function to perform after package removal.
create_placeholder :preflight, # Function for checks to see if install should occur.
:patch, # Function to perform patch operations prior to build from source.
:pre_cache_build, # Function to perform pre-build operations prior to starting & restarting a cached build.
:prebuild, # Function to perform pre-build operations prior to build from source.
:prebuild_config_and_report, # Function to add some reporting for buildsystems.
:build, # Function to perform build from source.
:postbuild, # Function to perform post-build for both source build and binary distribution.
:check, # Function to perform check from source build. (executes only during `crew build`)
:preinstall, # Function to perform pre-install operations prior to install.
:install, # Function to perform install from source build.
:postinstall, # Function to perform post-install for both source build and binary distribution.
:preremove, # Function to perform prior to package removal.
:remove, # Function to remove package.
:postremove # Function to perform after package removal.

class << self
attr_accessor :build_from_source, :built, :cached_build, :in_build, :in_install, :in_upgrade, :missing_binaries, :name
attr_accessor :build_from_source,
:built,
:cached_build,
:in_build,
:in_install,
:in_upgrade,
:missing_binaries,
:name
end

def self.agree_default_no(message = nil)
@@ -1,3 +1,4 @@
# Total size: 102267430
/usr/local/bin/WebKitWebDriver_4.1
/usr/local/include/webkitgtk-4.1/JavaScriptCore/JSBase.h
/usr/local/include/webkitgtk-4.1/JavaScriptCore/JSContextRef.h
@@ -89,6 +90,7 @@
/usr/local/include/webkitgtk-4.1/webkit/WebKitWebEditor.h
/usr/local/include/webkitgtk-4.1/webkit/WebKitWebExtension.h
/usr/local/include/webkitgtk-4.1/webkit/WebKitWebExtensionAutocleanups.h
/usr/local/include/webkitgtk-4.1/webkit/WebKitWebExtensionMatchPattern.h
/usr/local/include/webkitgtk-4.1/webkit/WebKitWebFormManager.h
/usr/local/include/webkitgtk-4.1/webkit/WebKitWebHitTestResult.h
/usr/local/include/webkitgtk-4.1/webkit/WebKitWebInspector.h
@@ -223,15 +225,16 @@
/usr/local/lib/girepository-1.0/WebKit2WebExtension-4.1.typelib
/usr/local/lib/libjavascriptcoregtk-4.1.so
/usr/local/lib/libjavascriptcoregtk-4.1.so.0
/usr/local/lib/libjavascriptcoregtk-4.1.so.0.5.7
/usr/local/lib/libjavascriptcoregtk-4.1.so.0.9.6
/usr/local/lib/libwebkit2gtk-4.1.so
/usr/local/lib/libwebkit2gtk-4.1.so.0
/usr/local/lib/libwebkit2gtk-4.1.so.0.13.7
/usr/local/lib/libwebkit2gtk-4.1.so.0.19.4
/usr/local/lib/pkgconfig/javascriptcoregtk-4.1.pc
/usr/local/lib/pkgconfig/webkit2gtk-4.1.pc
/usr/local/lib/pkgconfig/webkit2gtk-web-extension-4.1.pc
/usr/local/lib/webkit2gtk-4.1/injected-bundle/libwebkit2gtkinjectedbundle.so
/usr/local/libexec/webkit2gtk-4.1/MiniBrowser
/usr/local/libexec/webkit2gtk-4.1/WebKitGPUProcess
/usr/local/libexec/webkit2gtk-4.1/WebKitNetworkProcess
/usr/local/libexec/webkit2gtk-4.1/WebKitWebProcess
/usr/local/libexec/webkit2gtk-4.1/jsc

@@ -1,3 +1,4 @@
# Total size: 146107520
/usr/local/bin/WebKitWebDriver_4.1
/usr/local/include/webkitgtk-4.1/JavaScriptCore/JSBase.h
/usr/local/include/webkitgtk-4.1/JavaScriptCore/JSContextRef.h
@@ -89,6 +90,7 @@
/usr/local/include/webkitgtk-4.1/webkit/WebKitWebEditor.h
/usr/local/include/webkitgtk-4.1/webkit/WebKitWebExtension.h
/usr/local/include/webkitgtk-4.1/webkit/WebKitWebExtensionAutocleanups.h
/usr/local/include/webkitgtk-4.1/webkit/WebKitWebExtensionMatchPattern.h
/usr/local/include/webkitgtk-4.1/webkit/WebKitWebFormManager.h
/usr/local/include/webkitgtk-4.1/webkit/WebKitWebHitTestResult.h
/usr/local/include/webkitgtk-4.1/webkit/WebKitWebInspector.h
@@ -223,15 +225,16 @@
/usr/local/lib64/girepository-1.0/WebKit2WebExtension-4.1.typelib
/usr/local/lib64/libjavascriptcoregtk-4.1.so
/usr/local/lib64/libjavascriptcoregtk-4.1.so.0
/usr/local/lib64/libjavascriptcoregtk-4.1.so.0.5.7
/usr/local/lib64/libjavascriptcoregtk-4.1.so.0.9.6
/usr/local/lib64/libwebkit2gtk-4.1.so
/usr/local/lib64/libwebkit2gtk-4.1.so.0
/usr/local/lib64/libwebkit2gtk-4.1.so.0.13.7
/usr/local/lib64/libwebkit2gtk-4.1.so.0.19.4
/usr/local/lib64/pkgconfig/javascriptcoregtk-4.1.pc
/usr/local/lib64/pkgconfig/webkit2gtk-4.1.pc
/usr/local/lib64/pkgconfig/webkit2gtk-web-extension-4.1.pc
/usr/local/lib64/webkit2gtk-4.1/injected-bundle/libwebkit2gtkinjectedbundle.so
/usr/local/libexec/webkit2gtk-4.1/MiniBrowser
/usr/local/libexec/webkit2gtk-4.1/WebKitGPUProcess
/usr/local/libexec/webkit2gtk-4.1/WebKitNetworkProcess
/usr/local/libexec/webkit2gtk-4.1/WebKitWebProcess
/usr/local/libexec/webkit2gtk-4.1/jsc

@@ -1,32 +1,36 @@
require 'package'
require 'buildsystems/cmake'

class Webkit2gtk_4_1 < Package
class Webkit2gtk_4_1 < CMake
description 'Web content engine for GTK'
homepage 'https://webkitgtk.org'
version "2.44.3-#{CREW_ICU_VER}"
version '2.50.1'
license 'LGPL-2+ and BSD-2'
compatibility 'aarch64 armv7l x86_64'
source_url 'https://webkitgtk.org/releases/webkitgtk-2.44.3.tar.xz'
source_sha256 'dc82d042ecaca981a4852357c06e5235743319cf10a94cd36ad41b97883a0b54'
source_url 'https://github.com/WebKit/WebKit.git'
git_hashtag "webkitgtk-#{version}"
# From the webkitglib/2.50 branch.
# git_hashtag '84fb745f2dec1b1f6417ee6059e76a60f6a4f96b'
# source_url "https://webkitgtk.org/releases/webkitgtk-#{version.split('-').first}.tar.xz"
# source_sha256 'e564b8099f9a3ae32409539b290bbd2ad084e99b6d22d4aac5e51e4554df8bc2'
binary_compression 'tar.zst'

binary_sha256({
aarch64: '6c220476148a9b263f78e1a9dfc95562bcf8635fbcc403f2c461bbbf7b5f465f',
armv7l: '6c220476148a9b263f78e1a9dfc95562bcf8635fbcc403f2c461bbbf7b5f465f',
x86_64: '80b969a2d1a82d55852f5026aefac336c8092d8d8c3628450e7fb1a47121e3de'
aarch64: 'ca91b3dab02d6204eaadab28efbffa85dff186eb02ca7acef28bfc838cc385e6',
armv7l: 'ca91b3dab02d6204eaadab28efbffa85dff186eb02ca7acef28bfc838cc385e6',
x86_64: '5665205aa7b768e4234fee8800b19f8ec52736009b4f136d6af30d9e75394395'
})

depends_on 'at_spi2_core' # R
depends_on 'cairo' # R
depends_on 'ccache' => :build
depends_on 'dav1d' => :build
depends_on 'enchant' # R
depends_on 'expat' # R
depends_on 'fontconfig'
depends_on 'freetype' # R
depends_on 'gcc_lib' # R
depends_on 'gdk_pixbuf' # R
depends_on 'glibc' # R
depends_on 'glib' # R
depends_on 'glibc' # R
depends_on 'gobject_introspection' => :build
depends_on 'gstreamer' # R
depends_on 'gtk3' # R
@@ -40,7 +44,6 @@ class Webkit2gtk_4_1 < Package
depends_on 'libdrm' # R
depends_on 'libepoxy' # R
depends_on 'libgcrypt' # R
depends_on 'libglvnd' # R
depends_on 'libgpg_error' # R
depends_on 'libjpeg_turbo' # R
depends_on 'libjxl' # R
@@ -50,80 +53,99 @@ class Webkit2gtk_4_1 < Package
depends_on 'libsoup' # R
depends_on 'libtasn1' # R
depends_on 'libwebp' # R
depends_on 'libwpe' # R
depends_on 'libx11' # R
depends_on 'libxcomposite' # R
depends_on 'libxdamage' # R
depends_on 'libxml2' # R
depends_on 'libxrender' # R
depends_on 'libxslt' # R
depends_on 'libxt' # R
depends_on 'mesa' # R
depends_on 'openjpeg' # R
depends_on 'pango' # R
depends_on 'py3_gi_docgen' => :build
depends_on 'py3_smartypants' => :build
depends_on 'sqlite' # R
depends_on 'sysprof' => :build
depends_on 'unifdef' => :build
depends_on 'valgrind' => :build
depends_on 'vulkan_icd_loader'
depends_on 'wayland' # R
depends_on 'woff2' # R
depends_on 'wpebackend_fdo' # R
depends_on 'zlib' # R

cache_build
no_env_options
no_lto

def self.patch
# Fix incompatibility with gtk3 from current gobjects_introspection causing build failure
# as per https://bugs.webkit.org/show_bug.cgi?id=276180 .
downloader 'https://github.com/WebKit/WebKit/pull/30446.diff', '6beda7960b232117f8445db4e588a45ef384d42ccb13f6926b695c472a4eea51'
system 'patch -Np1 -i 30446.diff'
system "sed -i 's,/usr/bin,/usr/local/bin,g' Source/JavaScriptCore/inspector/scripts/codegen/preprocess.pl"
@arch_flags = ''
@gcc_ver = ''
@arch_flags = '-mfloat-abi=hard -mtls-dialect=gnu -mthumb -mfpu=vfpv3-d16 -mlibarch=armv7-a+fp -march=armv7-a+fp' if ARCH == 'armv7l' || ARCH == 'aarch64'
end

def self.build
# This builds webkit2gtk4_1 (which uses gtk3, but not libsoup2)
@workdir = Dir.pwd
# Bubblewrap sandbox breaks on epiphany with
# bwrap: Can't make symlink at /var/run: File exists
# LDFLAGS from debian: -Wl,--no-keep-memory
unless File.file?('build.ninja')
@arch_linker_flags = ARCH == 'x86_64' ? '' : '-Wl,--no-keep-memory'
system "CREW_LINKER_FLAGS='#{@arch_linker_flags}' \
pre_cmake_options "CC=#{CREW_PREFIX}/bin/gcc CXX=#{CREW_PREFIX}/bin/g++"
cmake_options "-DCMAKE_BUILD_WITH_INSTALL_RPATH=ON \
-DENABLE_BUBBLEWRAP_SANDBOX=OFF \
-DENABLE_DOCUMENTATION=OFF \
-DENABLE_GAMEPAD=OFF \
-DENABLE_JOURNALD_LOG=OFF \
-DENABLE_MINIBROWSER=ON \
-DENABLE_SPEECH_SYNTHESIS=OFF \
-DPORT=GTK \
-DPYTHON_EXECUTABLE=`which python` \
-DUSER_AGENT_BRANDING='Chromebrew' \
-DUSE_GTK4=OFF \
-DUSE_JPEGXL=ON \
-DUSE_SOUP2=OFF"
cmake_pre_cache_build_extras do
# This only works in the container.
system "sudo ln -sf #{CREW_PREFIX}/bin/gcc /usr/bin/gcc" if CREW_IN_CONTAINER
system "sudo ln -sf #{CREW_PREFIX}/bin/g++ /usr/bin/g++" if CREW_IN_CONTAINER
if ARCH == 'armv7l'
@arch_flags = '-mfloat-abi=hard -mtls-dialect=gnu -mthumb -mfpu=vfpv3-d16 -mlibarch=armv7-a+fp -march=armv7-a+fp'
@new_gcc = <<~NEW_GCCEOF
#!/bin/bash
gcc #{@arch_flags} $@
NEW_GCCEOF
@new_gpp = <<~NEW_GPPEOF
#!/bin/bash
# See https://wiki.debian.org/ReduceBuildMemoryOverhead
g++ #{@arch_flags} --param ggc-min-expand=10 $@
# g++ #{@arch_flags} $@
NEW_GPPEOF
FileUtils.mkdir_p 'bin'
File.write('bin/gcc', @new_gcc)
FileUtils.chmod 0o755, 'bin/gcc'
File.write('bin/g++', @new_gpp)
FileUtils.chmod 0o755, 'bin/g++'
end
end

if ARCH == 'armv7l'
def self.build
# This builds webkit2gtk4_1 (which uses gtk3, but not libsoup2)
@workdir = Dir.pwd
# Bubblewrap sandbox breaks on epiphany with
# bwrap: Can't make symlink at /var/run: File exists
# LDFLAGS from debian: -Wl,--no-keep-memory
unless File.file?('build.ninja')
@arch_linker_flags = ARCH == 'x86_64' ? '' : '-Wl,--no-keep-memory'
system "CC='#{@workdir}/bin/gcc' CXX='#{@workdir}/bin/g++' CREW_LINKER_FLAGS='#{@arch_linker_flags}' \
cmake -B builddir -G Ninja \
#{CREW_CMAKE_OPTIONS.sub('-pipe', '-pipe -Wno-error').gsub('-flto=auto', '').sub('-DCMAKE_INTERPROCEDURAL_OPTIMIZATION=TRUE', '')} \
-DCMAKE_BUILD_WITH_INSTALL_RPATH=ON \
-DENABLE_BUBBLEWRAP_SANDBOX=OFF \
-DENABLE_DOCUMENTATION=OFF \
-DENABLE_JOURNALD_LOG=OFF \
-DENABLE_GAMEPAD=OFF \
-DENABLE_JOURNALD_LOG=OFF \
-DENABLE_MINIBROWSER=ON \
-DUSE_SYSTEM_MALLOC=ON \
-DENABLE_SPEECH_SYNTHESIS=OFF \
-DPORT=GTK \
-DPYTHON_EXECUTABLE=`which python` \
-DUSER_AGENT_BRANDING='Chromebrew' \
-DUSE_GTK4=OFF \
-DUSE_JPEGXL=ON \
-DUSE_SOUP2=OFF \
-DPYTHON_EXECUTABLE=`which python` \
-DUSER_AGENT_BRANDING='Chromebrew'"
end
@counter = 1
@counter_max = 20
loop do
break if Kernel.system "#{CREW_NINJA} -C builddir -j #{CREW_NPROC}"

puts "Make iteration #{@counter} of #{@counter_max}...".orange

@counter += 1
break if @counter > @counter_max
-DUSE_SOUP2=OFF"
end
Kernel.system "#{CREW_NINJA} -C builddir -j #{CREW_NPROC}"
end
end

def self.install
system "DESTDIR=#{CREW_DEST_DIR} #{CREW_NINJA} -C builddir install"
cmake_install_extras do
FileUtils.mv "#{CREW_DEST_PREFIX}/bin/WebKitWebDriver", "#{CREW_DEST_PREFIX}/bin/WebKitWebDriver_4.1"
end
end

@@ -1,5 +1,5 @@
#!/usr/local/bin/ruby
# build_updated_packages version 3.4 (for Chromebrew)
# build_updated_packages version 3.5 (for Chromebrew)
# This updates the versions in python pip packages by calling
# tools/update_python_pip_packages.rb, checks for updated ruby packages
# by calling tools/update_ruby_gem_packages.rb, and then checks if any
@@ -160,7 +160,19 @@ updated_packages.each do |pkg|
puts "#{name.capitalize} #{@pkg_obj.version} needs builds uploaded for: #{builds_needed.join(' ')}".lightblue

if builds_needed.include?(ARCH) && !File.file?("release/#{ARCH}/#{name}-#{@pkg_obj.version}-chromeos-#{ARCH}.#{@pkg_obj.binary_compression}") && agree_default_yes("\nWould you like to build #{name} #{@pkg_obj.version}")
system "yes | nice -n 20 crew build -f #{pkg}"
# GitHub actions are killed after 6 hours, so need to force
# creation of build artifacts for long-running builds.
if ENV['NESTED_CI']
# Sleep for CREW_MAX_BUILD_TIME seconds, then send SIGINT to
# @pkg.build, which should trigger a build artifact upload.
puts "It is #{Time.now}."
puts "Will kill the build of #{name.capitalize} after #{Float(format('%.2g', CREW_MAX_BUILD_TIME.to_f / 3600))} hours at #{Time.at(Time.now.to_i + CREW_MAX_BUILD_TIME.to_i)}."
actions_timed_killer = fork do
exec "sleep #{CREW_MAX_BUILD_TIME}; killall -s INT crew; sleep 300; killall ruby"
end
Process.detach(actions_timed_killer)
end
system "yes | #{'CREW_CACHE_BUILD=1' if ENV['NESTED_CI']} nice -n 20 crew build -f #{pkg}"
build[name.to_sym] = $CHILD_STATUS.success?
unless build[name.to_sym]
if CONTINUE_AFTER_FAILED_BUILDS
@@ -170,6 +182,7 @@ updated_packages.each do |pkg|
abort "#{pkg} build failed!".lightred
end
end
Process.kill('HUP', actions_timed_killer) if ENV['NESTED_CI']
# Reinvoke this script to take just built packages that have been built and
# installed into account, attempting uploads of just built packages immediately.
cmdline = "cd #{`pwd`.chomp} && crew upload #{name} ; #{$PROGRAM_NAME} #{ARGV.join(' ')}"
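The CI watchdog in build_updated_packages is simply a detached child that sleeps for CREW_MAX_BUILD_TIME and then interrupts crew, so the SIGINT trap added in bin/crew can cache and upload the partial build before the GitHub Actions job is killed. A stripped-down sketch of that pattern (the package name is hypothetical):

    # Sketch of the timed-kill watchdog used for GitHub Actions builds.
    max_build_time = ENV.fetch('CREW_MAX_BUILD_TIME', '19800') # 5.5 hours, under the 6-hour job limit.

    watchdog = fork do
      # SIGINT triggers crew's trap handler, which caches and uploads the build directory;
      # the follow-up killall of ruby is a fallback if the handler itself hangs.
      exec "sleep #{max_build_time}; killall -s INT crew; sleep 300; killall ruby"
    end
    Process.detach(watchdog)

    system 'yes | CREW_CACHE_BUILD=1 nice -n 20 crew build -f some_package' # hypothetical package
    Process.kill('HUP', watchdog) # cancel the watchdog if the build finishes in time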