mirror of
https://github.com/chromebrew/chromebrew.git
synced 2026-01-07 22:54:11 -05:00
* amtk renamed to gedit_amtk
* rename amtk to libgedit_amtk
* Add binaries for py3_lxml
* Add libgedit_ packages
* initial icu4c commit, update other gedit files
* more adjustments to icu4c package file
* more package updates
* fixup
* alphabetize pkg_update_arr in fixup.rb
* bump version
* update postgres package file
* postgres => 16.2
* remove postgres i686 filelist
* update tcl i686 binary
* rebuild gspell, gedit
* add hunspell to gedit deps
* Add js115 package
* more updates
* add js115 binaries
* Add gnome rebuilds
* Adjust gnome boolean
* lint
* add gnome to libxml2
* add gnome postinstall to other buildsystems
* adjust gnome postinstall
* rebuild librsvg
* simply postinstall logic
* adjust postinstall
* Allow buildsystems source for postinstalls
* Also update mime db in gnome postinstall
* Add librsvg to gtk logical deps
* Add gnome postinstall to gtk[3,4]
* Add updated but not working blender 4 build
* add blender files
* update inkscape
* lint
* update mesa
* Add binaries for inkscape, mesa
* Add new packages to packages.yaml
* refactor gnome postinstall function
* more refactoring
* Add more updates
* update more packages
* move gnome function to lib/gnome.rb
* Update other gnome affiliated packages
* fixup
* add binaries for adwaita_icon_theme
* update libpng
* lint
* lint
* cleanup
* update license string
* fixup
* suggested changes
* change gnome logic to boolean
* suggested changes
* suggested changes
* updates and suggested changes
* update gnome_console
* lint
* add gnome to gimp
* adjust gimp deps
* cleanup
* fixup
* update expat
* Use gnome packages count to determine whether gnome postinstalls are run
* lint
* fixup
* Adjust ruby gem version
* update svt_av1, libotify
* more updates
* update libgee
* update more packages, revert from non-working 0.20.6 libgee build
* add missing deps to libgee
* gsound
* gnome-weather
* more gnome updates
* deprecate gnome_settings_daemon
* more builds
* remove gtk2 from ibus deps
* more updates
* more updates
* gnome_maps
* more updates
* more updates
* more updates, and also update glib binaries
* lint
* updates, use oxford comma
* fix deps
* add binaries
* fix spacing
* nautilus
* lint
* readd amtk
* rhythmbox for arm build succeeds

---------

Signed-off-by: Satadru Pramanik <satadru@gmail.com>
2087 lines
81 KiB
Ruby
Executable File
#!/usr/bin/env ruby
require 'uri'
require 'digest/sha2'
require 'json'
require 'fileutils'
require 'tmpdir'
require_relative '../commands/const'
require_relative '../commands/help'
require_relative '../commands/list'
require_relative '../lib/color'
require_relative '../lib/const'
require_relative '../lib/convert_size'
require_relative '../lib/deb_utils'
require_relative '../lib/docopt'
require_relative '../lib/downloader'
require_relative '../lib/gnome'
require_relative '../lib/package'
require_relative '../lib/util'

CREW_LICENSE = <<~LICENSESTRING
  Copyright (C) 2013-2024 Chromebrew Authors

  This program is free software: you can redistribute it and/or modify
  it under the terms of the GNU General Public License as published by
  the Free Software Foundation, either version 3 of the License, or
  (at your option) any later version.

  This program is distributed in the hope that it will be useful,
  but WITHOUT ANY WARRANTY; without even the implied warranty of
  MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
  GNU General Public License for more details.

  You should have received a copy of the GNU General Public License
  along with this program. If not, see https://www.gnu.org/licenses/gpl-3.0.html.

  Chromebrew embeds lib/docopt.rb from another project under the MIT License.
  You should have received a copy of the license along with this program.
  If not, see https://github.com/docopt/docopt.rb/blob/master/LICENSE
LICENSESTRING

DOC = <<~DOCOPT
  Chromebrew - Package manager for Chrome OS https://chromebrew.github.io

  Usage:
    crew build [options] [-k|--keep] <name> ...
    crew const [options] [<name> ...]
    crew deps [options] [--deep] [-t|--tree] [-b|--include-build-deps] [--exclude-buildessential] <name> ...
    crew download [options] [-s|--source] <name> ...
    crew files [options] <name> ...
    crew help [<command>] [<subcommand>]
    crew install [options] [-k|--keep] [-s|--source] [-S|--recursive-build] <name> ...
    crew list [options] (available|installed|compatible|incompatible)
    crew postinstall [options] <name> ...
    crew prop
    crew reinstall [options] [-k|--keep] [-s|--source] [-S|--recursive-build] <name> ...
    crew remove [options] <name> ...
    crew search [options] [<name> ...]
    crew sysinfo [options]
    crew test [<name> ...]
    crew update [options] [<compatible>]
    crew upgrade [options] [-k|--keep] [-s|--source] [<name> ...]
    crew upload [options] [<name> ...]
    crew whatprovides [options] <pattern> ...

  -b --include-build-deps    Include build dependencies in output.
  -t --tree                  Print dependencies in a tree-structure format.
  -c --color                 Use colors even if standard out is not a tty.
  -d --no-color              Disable colors even if standard out is a tty.
  -f --force                 Force where relevant.
  -k --keep                  Keep the `CREW_BREW_DIR` (#{CREW_BREW_DIR}) directory.
  -L --license               Display the crew license.
  -s --source                Build or download from source even if a pre-compiled binary exists.
  -S --recursive-build       Build from source, including all dependencies, even if pre-compiled binaries exist.
  -v --verbose               Show extra information.
  -V --version               Display the crew version.
  -h --help                  Show this screen.

  version #{CREW_VERSION}
DOCOPT

# All available crew commands.
@cmds = DOC.scan(/crew ([^\s]+)/).flatten

# Disallow running as root.
abort 'Chromebrew should not be run as root.'.lightred if Process.uid.zero?

# Add lib to LOAD_PATH.
$LOAD_PATH << File.join(CREW_LIB_PATH, 'lib')

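The `@cmds` scan pulls every subcommand token out of the usage text, so the command list never drifts from the documented interface. A standalone sketch with a hypothetical, reduced usage string:

```ruby
# Hypothetical reduced usage text, only to illustrate how @cmds is derived.
doc = <<~DOCOPT
  Usage:
    crew build [options] <name> ...
    crew install [options] <name> ...
    crew remove [options] <name> ...
DOCOPT

# Each match group captures the word following "crew ".
cmds = doc.scan(/crew ([^\s]+)/).flatten
p cmds # => ["build", "install", "remove"]
```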
# Parse arguments using docopt.
begin
  args = Docopt.docopt(DOC)
  args['<name>']&.map! { |arg| arg.tr('-', '_') }
rescue Docopt::Exit => e
  if ARGV[0]
    case ARGV[0]
    when '-V', '--version', 'version'
      puts CREW_VERSION
      exit 0
    when '-L', '--license', 'license'
      puts CREW_LICENSE
      exit 0
    end
    unless %w[-h --help].include?(ARGV[0])
      puts "Could not understand \"crew #{ARGV.join(' ')}\".".lightred
      # Look for similar commands.
      unless @cmds.include?(ARGV[0])
        similar = @cmds.select { |word| edit_distance(ARGV[0], word) < 4 }
        unless similar.empty?
          abort <<~EOT
            Did you mean?
             #{similar.join("\n ")}
          EOT
        end
      end
    end
  end
  abort e.message
end

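The "Did you mean?" suggestions use `edit_distance` (provided by Chromebrew's lib/util) with a cutoff of 4. As an illustration only, and not crew's implementation, a standard unit-cost Levenshtein distance looks like this:

```ruby
# Minimal Levenshtein distance sketch (insert/delete/substitute, cost 1 each).
def levenshtein(a, b)
  prev = (0..b.length).to_a
  a.each_char.with_index(1) do |ca, i|
    curr = [i]
    b.each_char.with_index(1) do |cb, j|
      cost = ca == cb ? 0 : 1
      curr << [prev[j] + 1, curr[j - 1] + 1, prev[j - 1] + cost].min
    end
    prev = curr
  end
  prev[b.length]
end

p levenshtein('instal', 'install') # => 1
```

With a threshold of "distance < 4", a typo like `crew instal` still surfaces `install` as a suggestion.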
# Override the default color options if specified.
String.use_color = args['--color'] || !args['--no-color']

@opt_force = args['--force']
@opt_keep = args['--keep']
@opt_verbose = args['--verbose']
@opt_source = args['--source']
@opt_recursive = args['--recursive-build']

# Verbose options.
@fileutils_verbose = @opt_verbose
@verbose = @opt_verbose ? 'v' : ''
@short_verbose = @opt_verbose ? '-v' : ''

class ExitMessage
  @messages = []

  def self.add(msg, print_last: false)
    # Use the print_last option to allow important messages (like sommelier) to print at the bottom.
    # Usage:
    #   ExitMessage.add 'Last Message', print_last: true
    @messages << [msg.lightcyan, print_last]
  end

  def self.handle_messages(msg)
    puts msg
    # Delete the printed message from the array so each message prints only once.
    @messages.reject! { |x| x.include? msg }
  end

  def self.print
    # Print the messages without print_last first, then the print_last messages.
    # (&:last selects the print_last flag of each entry.)
    handle_messages(@messages.reject(&:last).map(&:first).first) until @messages.reject(&:last).map(&:first).empty?
    handle_messages(@messages.select(&:last).map(&:first).first) until @messages.select(&:last).map(&:first).empty?
  end
end

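The ordering `ExitMessage.print` implements can be shown without the class or the color extensions: entries flagged `print_last` are emitted after all the others, in queue order within each group.

```ruby
# Each entry is [message, print_last], as stored by ExitMessage.add.
messages = [['a', false], ['last', true], ['b', false]]

# Non-print_last messages first, then the print_last ones.
ordered = messages.reject(&:last).map(&:first) + messages.select(&:last).map(&:first)
p ordered # => ["a", "b", "last"]
```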
at_exit do
  begin
    gem 'activesupport'
  rescue Gem::LoadError
    puts ' -> install gem activesupport'
    Gem.install('activesupport')
    gem 'activesupport'
  end
  require 'active_support/core_ext/object/blank'
  GnomePostinstall.run unless GnomePostinstall.gnome_packages.blank?

  # Print exit messages.
  ExitMessage.print
end

def load_json
  # load_json: (re)load device.json.
  json_path = File.join(CREW_CONFIG_PATH, 'device.json')
  @device = JSON.load_file(json_path, symbolize_names: true)

  # Symbolize the values as well.
  @device.transform_values! { |val| val.is_a?(String) ? val.to_sym : val }
end

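`symbolize_names: true` only symbolizes keys; the `transform_values!` call converts top-level String values to Symbols too. A standalone illustration with hypothetical device data:

```ruby
# Stand-in for the parsed device.json hash (keys already symbolized).
device = { architecture: 'x86_64', installed_packages: [] }

# Convert String values to Symbols; leave arrays and other values alone.
device.transform_values! { |val| val.is_a?(String) ? val.to_sym : val }
p device[:architecture] # => :x86_64
```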
def print_package(pkg_path, extra = false)
  pkg_name = File.basename pkg_path, '.rb'
  begin
    set_package pkg_name, pkg_path
  rescue StandardError => e
    warn "Error with #{pkg_name}.rb: #{e}".red unless e.to_s.include?('uninitialized constant')
  end
  print_current_package extra
end

def print_current_package(extra = false)
  status = if @device[:installed_packages].any? { |elem| elem[:name] == @pkg.name }
             :installed
           elsif !@pkg.compatible?
             :incompatible
           else
             :available
           end

  case status
  when :installed
    print @pkg.name.lightgreen
  when :incompatible
    print @pkg.name.lightred
  when :available
    print @pkg.name.lightblue
  end

  print ": #{@pkg.description}".lightblue if @pkg.description
  if extra
    puts ''
    puts @pkg.homepage if @pkg.homepage
    puts "Version: #{@pkg.version}"
    print "License: #{@pkg.license}" if @pkg.license
  end
  puts ''
end

def set_package(pkg_name, pkg_path)
  begin
    @pkg = Package.load_package(pkg_path, pkg_name)
  rescue SyntaxError => e
    warn "#{e.class}: #{e.message}".red
  end

  @pkg.build_from_source = true if @opt_recursive
end

def list_packages
  Dir["#{CREW_PACKAGES_PATH}/*.rb"].each do |filename|
    print_package filename
  end
end

def generate_compatible
  puts 'Generating compatible packages...'.orange if @opt_verbose
  @device[:compatible_packages] = []
  Dir["#{CREW_PACKAGES_PATH}/*.rb"].each do |filename|
    pkg_name = File.basename filename, '.rb'
    begin
      set_package pkg_name, filename
    rescue StandardError => e
      puts "Error with #{pkg_name}.rb: #{e}".red unless e.to_s.include?('uninitialized constant')
    end
    puts "Checking #{pkg_name} for compatibility.".orange if @opt_verbose
    if @pkg.compatible?
      # Add to the compatible packages.
      puts "Adding #{pkg_name} #{@pkg.version} to compatible packages.".lightgreen if @opt_verbose
      @device[:compatible_packages].push(name: @pkg.name)
    elsif @opt_verbose
      puts "#{pkg_name} is not a compatible package.".lightred
    end
  end
  File.open(File.join(CREW_CONFIG_PATH, 'device.json'), 'w') do |file|
    output = JSON.parse @device.to_json
    file.write JSON.pretty_generate(output)
  end
  puts 'Generating compatible packages done.'.orange if @opt_verbose
end

def search(pkg_name, silent = false)
  pkg_path = File.join(CREW_PACKAGES_PATH, "#{pkg_name}.rb")
  begin
    return set_package(pkg_name, pkg_path) if File.file?(pkg_path)
  rescue StandardError => e
    puts "Error with #{pkg_name}.rb: #{e}".lightred unless e.to_s.include?('uninitialized constant')
  end
  unless File.file?(pkg_path) && silent
    @pkg = nil
    abort "Package #{pkg_name} not found. 😞".lightred unless silent
    return
  end
end

def regexp_search(pkg_pat)
  re = Regexp.new(pkg_pat, true)
  results = Dir["#{CREW_PACKAGES_PATH}/*.rb"] \
            .select { |f| File.basename(f, '.rb') =~ re } \
            .each { |f| print_package(f, @opt_verbose) }
  if results.empty?
    Dir["#{CREW_PACKAGES_PATH}/*.rb"].each do |package_path|
      package_name = File.basename package_path, '.rb'
      begin
        set_package package_name, package_path
      rescue StandardError => e
        puts "Error with #{package_name}.rb: #{e}".red unless e.to_s.include?('uninitialized constant')
      end
      if @pkg.description =~ /#{pkg_pat}/i
        print_current_package @opt_verbose
        results.push(package_name)
      end
    end
  end
  abort "Package #{pkg_pat} not found. :(".lightred if results.empty?
end

def cache_build
  build_cachefile = File.join(CREW_CACHE_DIR, "#{@pkg.name}-#{@pkg.version}-build-#{@device[:architecture]}.tar.zst")
  if CREW_CACHE_ENABLED && File.writable?(CREW_CACHE_DIR)
    puts 'Caching build dir...'
    pkg_build_dirname_absolute = File.join(CREW_BREW_DIR, @extract_dir)
    pkg_build_dirname = File.basename(pkg_build_dirname_absolute)
    Dir.chdir pkg_build_dirname_absolute do
      # Do not use --exclude-vcs with tar to exclude .git,
      # because some builds will use that information.
      # Back up the build cachefile if it exists.
      FileUtils.mv build_cachefile, "#{build_cachefile}.bak", force: true if File.file?(build_cachefile)
      FileUtils.mv "#{build_cachefile}.sha256", "#{build_cachefile}.sha256.bak", force: true if File.file?("#{build_cachefile}.sha256")
      Dir.chdir(CREW_BREW_DIR) do
        system "tar c#{@verbose} #{pkg_build_dirname} \
          | nice -n 20 #{CREW_PREFIX}/bin/zstd -c --ultra --fast -f -o #{build_cachefile} -"
      end
    end
    system "sha256sum #{build_cachefile} > #{build_cachefile}.sha256"
    puts "Build directory cached at #{build_cachefile}".lightgreen
  else
    puts 'CREW_CACHE_ENABLED is not set.'.orange unless CREW_CACHE_ENABLED
    puts 'CREW_CACHE_DIR is not writable.'.lightred unless File.writable?(CREW_CACHE_DIR)
  end
end

def files(pkg_name)
  local_filelist = File.join(CREW_META_PATH, "#{pkg_name}.filelist")
  manifest_filelist = File.join(CREW_LIB_PATH, "manifest/#{ARCH}/#{pkg_name[0]}/#{pkg_name}.filelist")

  if File.exist?(local_filelist)
    # Prefer the local filelist.
    filelist_path = local_filelist
  elsif File.exist?(manifest_filelist)
    # Fall back to the manifest directory if the package is not installed.
    filelist_path = manifest_filelist
  else
    # The package does not have any filelist available.
    warn "Package #{pkg_name} is not installed. :(".lightred
    return false
  end

  filelist = File.readlines(filelist_path, chomp: true).sort
  lines = filelist.size
  size = 0

  filelist.each do |filename|
    # Ignore symlinks to avoid counting a file twice.
    size += File.size(filename) if File.file?(filename) && !File.symlink?(filename)
  end

  puts filelist, <<~EOT.lightgreen

    Total found: #{lines}
    Disk usage: #{human_size(size)}
  EOT
end

def prop(silent = false)
  props = []
  pkg = Package.new
  excluded_methods = %w[compatible binary source json_creatable autoload include const_defined class_variable_defined singleton_class method_defined public_method_defined private_method_defined protected_method_defined instance_variable_defined instance_of kind_of is_a frozen nil eql respond_to equal]
  all_methods = pkg.class.methods.grep(/\?$/).to_s.gsub(/([?:,\[\]])/, '').split
  all_methods.each do |method|
    props.push(method) unless excluded_methods.include?(method)
  end
  if silent
    return props
  else
    puts props.sort
    puts "For more information, type 'crew help prop <property>' where <property> is one of the above properties.".lightblue
  end
end

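`prop` harvests predicate-style (`?`-suffixed) class methods from `Package`, then filters out inherited Ruby predicates via `excluded_methods`. The same grep can be shown with a stand-in class (`DemoPackage` is hypothetical, not part of crew):

```ruby
# Hypothetical stand-in for the Package class.
class DemoPackage
  def self.compatible?
    true
  end
end

# grep(/\?$/) keeps only the ?-suffixed (predicate) method names;
# inherited predicates like respond_to? appear too, which is why
# prop maintains an exclusion list.
predicates = DemoPackage.methods.grep(/\?$/).map(&:to_s)
p predicates.include?('compatible?') # => true
```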
def whatprovides(regex_pat)
  matched_list = `grep -R "#{regex_pat}" #{CREW_LIB_PATH}/manifest/#{ARCH}`.lines(chomp: true).flat_map do |result|
    filelist, matched_file = result.split(':', 2)
    pkg_name = File.basename(filelist, '.filelist')
    pkg_name_status = pkg_name
    if @device[:compatible_packages].any? { |elem| elem[:name] == pkg_name }
      pkg_name_status = pkg_name.lightgreen if File.file? "#{CREW_META_PATH}/#{pkg_name}.filelist"
    else
      pkg_name_status = pkg_name.lightred
    end

    "#{pkg_name_status}: #{matched_file}"
  end.sort

  puts matched_list, "\nTotal found: #{matched_list.length}".lightgreen if matched_list.any?
end

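Each `grep -R` hit has the form `<filelist path>:<matched file>`, and `whatprovides` splits on the first colon only (`split(':', 2)`) so matched paths that themselves contain colons survive intact. A standalone illustration with a made-up grep hit:

```ruby
# Hypothetical grep -R output line: "<filelist>:<matched file>".
result = '/path/to/manifest/x86_64/l/libfoo.filelist:/usr/local/lib/libfoo.so.1'

# Split on the FIRST colon only.
filelist, matched_file = result.split(':', 2)
p File.basename(filelist, '.filelist') # => "libfoo"
p matched_file # => "/usr/local/lib/libfoo.so.1"
```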
def update
  abort "'crew update' is used to update crew itself. Use 'crew upgrade <package1> [<package2> ...]' to update specific packages.".orange if @pkg_name

  # Update the package lists.
  Dir.chdir(CREW_LIB_PATH) do
    # Set sparse-checkout folders.
    system "git sparse-checkout set packages manifest/#{ARCH} lib commands bin crew tests tools"
    system 'git sparse-checkout reapply'

    system "git fetch #{CREW_REPO} #{CREW_BRANCH}", exception: true
    system 'git reset --hard FETCH_HEAD', exception: true
    system 'git-restore-mtime -sq 2>/dev/null', exception: true if File.file?("#{CREW_PREFIX}/bin/git-restore-mtime")
  end

  puts 'Package lists, crew, and library updated.'

  # Do any fixups necessary after crew has updated from git.
  load "#{CREW_LIB_PATH}/lib/fixup.rb"

  # Update the compatible packages.
  generate_compatible

  # Check for outdated installed packages.
  puts 'Checking for package updates...'

  can_be_updated = 0
  @device[:installed_packages].each do |package|
    search package[:name], true
    unless @pkg
      puts "Package file for #{package[:name]} not found. :(".lightred if @opt_verbose
      next
    end
    different_version = (package[:version] != @pkg.version)
    has_sha = !(@pkg.get_binary_sha256(@device[:architecture]).to_s.empty? || package[:binary_sha256].to_s.empty?)
    different_sha = has_sha && package[:binary_sha256] != @pkg.get_binary_sha256(@device[:architecture])

    can_be_updated += 1 if different_version || different_sha

    if different_version && !different_sha && has_sha
      can_be_updated -= 1
      puts "#{@pkg.name} has a version change but does not have updated binaries".yellow
    elsif different_version
      puts "#{@pkg.name} could be updated from #{package[:version]} to #{@pkg.version}"
    elsif !different_version && different_sha
      puts "#{@pkg.name} could be updated (rebuild)"
    end
  end

  if can_be_updated.positive?
    puts "\n#{can_be_updated} packages can be updated."
    puts 'Run `crew upgrade` to update all packages or `crew upgrade <package1> [<package2> ...]` to update specific packages.'
  else
    puts 'Your software is up to date.'.lightgreen
  end
end

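The decision in `update` counts a package as updatable when its version or its binary sha256 changed, but a version bump with no new binaries is skipped. `updatable?` below is a hypothetical helper distilling that logic, not part of crew:

```ruby
# installed/latest are hashes with :version and :sha (sha may be nil or '').
def updatable?(installed, latest)
  different_version = installed[:version] != latest[:version]
  has_sha = !(installed[:sha].to_s.empty? || latest[:sha].to_s.empty?)
  different_sha = has_sha && installed[:sha] != latest[:sha]
  # Version bumped but binaries unchanged: nothing to install yet.
  return false if different_version && !different_sha && has_sha
  different_version || different_sha
end

p updatable?({ version: '1.0', sha: 'a' }, { version: '1.1', sha: 'b' }) # => true
p updatable?({ version: '1.0', sha: 'a' }, { version: '1.1', sha: 'a' }) # => false
p updatable?({ version: '1.0', sha: 'a' }, { version: '1.0', sha: 'b' }) # => true
```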
def upgrade(*pkgs, build_from_source: false)
  check_update_avail = lambda do |pkg_file|
    pkg_name = File.basename(pkg_file, '.rb')

    unless File.file?(pkg_file)
      warn "Package file for installed package #{pkg_name} is missing.".lightred
      return false
    end

    pkgs.each do
      unless @device[:installed_packages].any? { |package| package[:name] == pkg_name }
        puts 'Package '.lightred + pkg_name.orange + ' is not installed. 😔 You may try this: '.lightred + "crew install #{pkg_name}".lightblue
        return false
      end
    end
    pkg_ver_latest = Package.load_package(pkg_file, pkg_name).version
    pkg_ver_installed = @device[:installed_packages].select { |pkg| pkg[:name] == pkg_name }[0][:version]
    pkg_hash_latest = Package.load_package(pkg_file, pkg_name).get_binary_sha256(@device[:architecture])
    pkg_hash_installed = @device[:installed_packages].select { |pkg| pkg[:name] == pkg_name }[0][:binary_sha256]

    return pkg_hash_latest != pkg_hash_installed unless !pkg_hash_installed || pkg_hash_latest == ''
    return pkg_ver_latest != pkg_ver_installed
  end

  to_be_upgraded = []

  if pkgs.any?
    # Check the specified package(s).
    pkgs.each do |pkg_name|
      pkg_file = File.join(CREW_PACKAGES_PATH, "#{pkg_name}.rb")
      to_be_upgraded << pkg_name if check_update_avail.call(pkg_file)
    end
  else
    # Check all installed packages if no package name was provided.
    @device[:installed_packages].each do |pkg|
      pkg_file = File.join(CREW_PACKAGES_PATH, "#{pkg[:name]}.rb")
      to_be_upgraded << pkg[:name] if check_update_avail.call(pkg_file)
    end
  end

  if to_be_upgraded.empty?
    puts 'Your software is already up to date.'.lightgreen
    return true
  end

  # Eventually, the upgrade order should be generated from an analysis of the
  # dependency hierarchy, to make sure that earlier dependencies get upgraded
  # first.

  # Upgrade OpenSSL first if OpenSSL is in the upgrade list, as other
  # package upgrades, especially their postinstalls, may break until the
  # new version of OpenSSL is installed.
  to_be_upgraded.insert(0, to_be_upgraded.delete('openssl')) if to_be_upgraded.include?('openssl')

  # Only upgrade ruby if ruby is in the upgrade list, as other
  # package upgrades may break until crew is rerun with the new
  # version of ruby.
  rerun_upgrade = false
  if to_be_upgraded.include?('ruby')
    to_be_upgraded = ['ruby']
    rerun_upgrade = true
  end

  # Install new dependencies (if any).
  to_be_upgraded.each do |pkg_name|
    search(pkg_name)
    resolve_dependencies
  end

  puts 'Updating packages...'

  # Upgrade the packages.
  to_be_upgraded.each do |pkg_name|
    search(pkg_name)
    print_current_package
    @pkg.build_from_source = (build_from_source || CREW_BUILD_FROM_SOURCE)

    puts "Updating #{@pkg.name}..." if @opt_verbose

    @pkg.in_upgrade = true
    resolve_dependencies_and_install
  end

  puts 'Packages have been updated.'.lightgreen unless rerun_upgrade
  puts "Ruby was upgraded. Please run 'crew upgrade' again to make sure upgrades are complete.".lightblue if rerun_upgrade
end

def download
  url = @pkg.get_url(@device[:architecture])
  source = @pkg.source?(@device[:architecture])

  uri = URI.parse url
  filename = File.basename(uri.path)
  sha256sum = @pkg.get_sha256(@device[:architecture])
  @extract_dir = @pkg.get_extract_dir

  build_cachefile = File.join(CREW_CACHE_DIR, "#{@pkg.name}-#{@pkg.version}-build-#{@device[:architecture]}.tar.zst")
  return { source:, filename: } if CREW_CACHE_BUILD && File.file?(build_cachefile)

  if !url
    abort "No precompiled binary or source is available for #{@device[:architecture]}.".lightred
  elsif @pkg.build_from_source
    puts 'Downloading source...'
  elsif !source
    puts 'Precompiled binary available, downloading...'
  elsif url.casecmp?('SKIP')
    puts 'Skipping source download...'
  else
    puts 'No precompiled binary available for your platform, downloading source...'
  end

  git = uri.scheme.eql?('git')

  Dir.chdir CREW_BREW_DIR do
    case File.basename(filename)
    # Sources that download with our internal downloader.
    when /\.zip$/i, /\.(tar(\.(gz|bz2|xz|lzma|lz|zst))?|tgz|tbz|tpxz|txz)$/i, /\.deb$/i, /\.AppImage$/i
      # Recall the file from the cache if requested.
      if CREW_CACHE_ENABLED
        puts "Looking for #{@pkg.name} archive in cache".orange if @opt_verbose
        # Prefer CREW_LOCAL_BUILD_DIR over CREW_CACHE_DIR.
        local_build_cachefile = File.join(CREW_LOCAL_BUILD_DIR, filename)
        crew_cache_dir_cachefile = File.join(CREW_CACHE_DIR, filename)
        cachefile = File.file?(local_build_cachefile) ? local_build_cachefile : crew_cache_dir_cachefile
        puts "Using #{@pkg.name} archive from the build cache at #{cachefile}; the checksum will not be checked against the package file.".orange if cachefile.include?(CREW_LOCAL_BUILD_DIR)
        if File.file?(cachefile)
          puts "#{@pkg.name.capitalize} archive file exists in cache".lightgreen if @opt_verbose
          # Don't check the checksum if the file is in the build cache.
          if Digest::SHA256.hexdigest(File.read(cachefile)) == sha256sum || sha256sum =~ /^SKIP$/i || cachefile.include?(CREW_LOCAL_BUILD_DIR)
            begin
              # Hard link the cached file if possible.
              FileUtils.ln cachefile, CREW_BREW_DIR, force: true, verbose: @fileutils_verbose unless File.identical?(cachefile, "#{CREW_BREW_DIR}/#{filename}")
              puts 'Archive hard linked from cache'.green if @opt_verbose
            rescue StandardError
              # Copy the cached file if hard linking fails.
              FileUtils.cp cachefile, CREW_BREW_DIR, verbose: @fileutils_verbose unless File.identical?(cachefile, "#{CREW_BREW_DIR}/#{filename}")
              puts 'Archive copied from cache'.green if @opt_verbose
            end
            puts 'Archive found in cache'.lightgreen
            return { source:, filename: }
          else
            puts 'Cached archive checksum mismatch. 😔 Will download.'.lightred
            cachefile = ''
          end
        else
          puts 'Cannot find cached archive. 😔 Will download.'.lightred
          cachefile = ''
        end
      end
      # Download the file if it was not cached.
      downloader url, sha256sum, filename, @opt_verbose

      puts "#{@pkg.name.capitalize} archive downloaded.".lightgreen
      # Stow the file in the cache if requested, if the file is not from the
      # cache, and if the cache is writable.
      if CREW_CACHE_ENABLED && cachefile.to_s.empty? && File.writable?(CREW_CACHE_DIR)
        begin
          # Hard link to the cache if possible.
          FileUtils.ln filename, CREW_CACHE_DIR, verbose: @fileutils_verbose
          puts 'Archive hard linked to cache'.green if @opt_verbose
        rescue StandardError
          # Copy to the cache if hard linking fails.
          FileUtils.cp filename, CREW_CACHE_DIR, verbose: @fileutils_verbose
          puts 'Archive copied to cache'.green if @opt_verbose
        end
      end
      return { source:, filename: }

    when /^SKIP$/i
      Dir.mkdir @extract_dir
    when /\.git$/i # Source URLs which end with .git are git sources.
      git = true
    else
      Dir.mkdir @extract_dir
      downloader url, sha256sum, filename, @opt_verbose

      puts "#{filename}: File downloaded.".lightgreen

      FileUtils.mv filename, "#{@extract_dir}/#{filename}"
    end

    # Handle git sources.
    if git
      # Recall the repository from the cache if requested.
      if CREW_CACHE_ENABLED
        # No git branch specified, just a git commit or tag.
        if @pkg.git_branch.to_s.empty?
          abort 'No Git branch, commit, or tag specified!'.lightred if @pkg.git_hashtag.to_s.empty?
          cachefile = File.join(CREW_CACHE_DIR, "#{@pkg.name}_#{@pkg.git_hashtag.gsub('/', '_')}.tar.zst")
        # Git branch and git commit specified.
        elsif !@pkg.git_hashtag.to_s.empty?
          cachefile = File.join(CREW_CACHE_DIR, "#{@pkg.name}_#{@pkg.git_branch.gsub(/[^0-9A-Za-z.-]/, '_')}_#{@pkg.git_hashtag.gsub('/', '_')}.tar.zst")
        # Git branch specified, without a specific git commit.
        else
          # Use day granularity for the branch timestamp when no specific commit is specified.
          cachefile = File.join(CREW_CACHE_DIR, "#{@pkg.name}_#{@pkg.git_branch.gsub(/[^0-9A-Za-z.-]/, '_')}#{Time.now.strftime('%m%d%Y')}.tar.zst")
        end
        puts "Git cachefile is #{cachefile}".orange if @opt_verbose
        if File.file?(cachefile) && File.file?("#{cachefile}.sha256")
          if Dir.chdir(CREW_CACHE_DIR) { system "sha256sum -c #{cachefile}.sha256" }
            FileUtils.mkdir_p @extract_dir
            system "tar -Izstd -x#{@verbose}f #{cachefile} -C #{@extract_dir}"
            return { source:, filename: }
          else
            puts 'Cached git repository checksum mismatch. 😔 Will download.'.lightred
          end
        else
          puts 'Cannot find cached git repository. 😔 Will download.'.lightred
        end
      end
      # Download via git.
      Dir.mkdir @extract_dir
      Dir.chdir @extract_dir do
        if @pkg.git_branch.to_s.empty?
          system 'git init'
          system 'git config advice.detachedHead false'
          system 'git config init.defaultBranch master'
          system "git remote add origin #{@pkg.source_url}", exception: true
          system "git fetch --depth 1 origin #{@pkg.git_hashtag}", exception: true
          system 'git checkout FETCH_HEAD'
        else
          # Leave a message because this step can be slow.
          puts 'Downloading src from a git branch. This may take a while...'
          system "git clone --branch #{@pkg.git_branch} --single-branch #{@pkg.source_url} tmpdir", exception: true
          system 'mv tmpdir/.git . && rm -rf tmpdir'
          system "git reset --hard #{@pkg.git_hashtag}", exception: true
        end
        system 'git submodule update --init --recursive' unless @pkg.no_git_submodules?
        system 'git fetch --tags', exception: true if @pkg.git_fetchtags?
        system "git fetch origin #{@pkg.git_hashtag}", exception: true if @pkg.git_clone_deep?
        puts 'Repository downloaded.'.lightgreen
      end
      # Stow the repository in the cache if requested and the cache is writable.
      if CREW_CACHE_ENABLED && File.writable?(CREW_CACHE_DIR)
        puts 'Caching downloaded git repo...'
        Dir.chdir @extract_dir do
          # Do not use --exclude-vcs to exclude .git,
          # because some builds will use that information.
          system "tar c#{@verbose} \
            $(find -mindepth 1 -maxdepth 1 -printf '%P\n') | \
            nice -n 20 #{CREW_PREFIX}/bin/zstd -c -T0 --ultra -20 - > \
            #{cachefile}"
        end
        system 'sha256sum', cachefile, out: "#{cachefile}.sha256"
        puts 'Git repo cached.'.lightgreen
      end
    end
  end

  return { source:, filename: }
end


def unpack(meta)
  target_dir = nil
  Dir.chdir CREW_BREW_DIR do
    FileUtils.mkdir_p @extract_dir, verbose: @fileutils_verbose

    build_cachefile = File.join(CREW_CACHE_DIR, "#{@pkg.name}-#{@pkg.version}-build-#{@device[:architecture]}.tar.zst")
    if CREW_CACHE_BUILD && File.file?(build_cachefile) && File.file?("#{build_cachefile}.sha256") && system("cd #{CREW_CACHE_DIR} && sha256sum -c #{build_cachefile}.sha256")
      @pkg.cached_build = true
      puts "Extracting cached build directory from #{build_cachefile}".lightgreen
      system "tar -Izstd -x#{@verbose}f #{build_cachefile} -C #{CREW_BREW_DIR}", exception: true
      # Need to reset @extract_dir to the extracted cached build
      # directory.
      @extract_dir = `tar -Izstd --exclude='./*/*' -tf #{build_cachefile} | cut -d '/' -f 1 | LC_ALL=C sort -u`.chomp
    else
      @pkg.cached_build = false
      case File.basename meta[:filename]
      when /\.zip$/i
        puts "Unpacking archive using 'unzip', this may take a while..."
        system 'unzip', (@opt_verbose ? '-v' : '-qq'), '-d', @extract_dir, meta[:filename], exception: true
      when /\.(tar(\.(gz|bz2|xz|lzma|lz))?|tgz|tbz|txz|tpxz)$/i
        puts "Unpacking archive using 'tar', this may take a while..."
        system 'tar', "-x#{@verbose}f", meta[:filename], '-C', @extract_dir, exception: true
      when /\.tar\.zst$/i
        puts "Unpacking archive using 'tar', this may take a while..."
        system 'tar', '-Izstd', "-x#{@verbose}f", meta[:filename], '-C', @extract_dir, exception: true
      when /\.deb$/i
        puts "Unpacking '.deb' archive, this may take a while..."
        DebUtils.extract_deb(meta[:filename], /data\..*/)
        system 'tar', "-x#{@verbose}f", *Dir['data.*'], '-C', @extract_dir, exception: true
      when /\.AppImage$/i
        puts "Unpacking 'AppImage' archive, this may take a while..."
        FileUtils.chmod 0o755, meta[:filename], verbose: @fileutils_verbose
        system "../#{meta[:filename]}", '--appimage-extract', chdir: @extract_dir, exception: true
      end
    end
    if meta[:source]
      # Check the number of directories in the archive.
      entries = Dir["#{@extract_dir}/*"]

      if entries.empty? && @opt_verbose
        # This will happen with SKIP packages.
        puts "Empty archive: #{meta[:filename]}".orange
      end
      target_dir = if entries.length == 1 && File.directory?(entries.first)
                     # Use `extract_dir/dir_in_archive` if there is only one directory.
                     entries.first
                   else
                     # Use `extract_dir` otherwise.
                     @extract_dir
                   end
    else
      # Use `extract_dir` for binary distributions.
      target_dir = @extract_dir
    end
    # Remove the tarball to save space.
    FileUtils.rm_f meta[:filename], verbose: @fileutils_verbose
  end
  return File.join(CREW_BREW_DIR, target_dir)
end

def build_and_preconfigure(target_dir)
  Dir.chdir target_dir do
    unless @pkg.no_compile_needed?
      puts 'Building from source, this may take a while...'

      # Load musl options only if the package is targeted at the musl toolchain.
      load File.join(CREW_LIB_PATH, 'lib/musl.rb').to_s if @pkg.is_musl?
    end

    @pkg.in_build = true
    unless @pkg.cached_build
      @pkg.patch
      @pkg.prebuild
    end

    begin
      @pkg.build
    rescue StandardError
      if CREW_CACHE_FAILED_BUILD
        cache_build
        abort 'There was a build error, caching build directory.'.lightred
      end
      abort 'There was a build error.'.lightred
    end
    @pkg.in_build = false
    # Wipe the crew destdir.
    FileUtils.rm_rf Dir["#{CREW_DEST_DIR}/*"], verbose: @fileutils_verbose
    puts 'Preconfiguring package...'
    cache_build if CREW_CACHE_BUILD
    @pkg.install
  end
end

def pre_flight
  puts 'Performing pre-flight checks...'
  if defined?(@pkg.min_glibc) && @pkg.min_glibc && (Gem::Version.new(LIBC_VERSION.to_s) < Gem::Version.new(@pkg.min_glibc))
    puts "\n#{@pkg.name.capitalize} requires glibc #{@pkg.min_glibc} and above.".lightred
    abort "ChromeOS is currently running glibc #{LIBC_VERSION}.\n".lightred
  end
  @pkg.preflight
end

def pre_install(dest_dir)
  Dir.chdir dest_dir do
    puts 'Performing pre-install...'
    @pkg.preinstall
    # Reload device.json in case preinstall modified it via
    # running 'crew remove packages...'
    load_json
  end
end

def post_install
  if @pkg.print_source_bashrc? || @pkg.gnome?
    ExitMessage.add <<~PRINT_SOURCE_BASHRC_EOT.lightblue, print_last: true

      To finish the installation, please execute the following:
      source ~/.bashrc
    PRINT_SOURCE_BASHRC_EOT
  end

  GnomePostinstall.add @pkg.name if @pkg.gnome?

  # Return unless the postinstall function was defined by the package recipe.
  return unless @pkg.method(:postinstall).source_location[0].include?("#{@pkg.name}.rb")

  Dir.mktmpdir do |post_install_tempdir|
    Dir.chdir post_install_tempdir do
      puts "Performing post-install for #{@pkg.name}...".lightblue
      @pkg.postinstall
    end
  end
end

def compress_doc(dir)
  # Check whether crew should compress.
  return if CREW_NOT_COMPRESS || @pkg.no_compress? || !File.file?("#{CREW_PREFIX}/bin/compressdoc")

  if Dir.exist? dir
    system "find #{dir} -type f ! -perm -200 | xargs -r chmod u+w"
    system "compressdoc --zstd #{@short_verbose} #{dir}"
  end
end

def determine_conflicts(dir, pkg)
  conflicts = []
  if File.file?("#{dir}/filelist")
    if File.file?(File.join(CREW_META_PATH, "#{pkg}.filelist"))
      puts 'Checking for conflicts with files from installed packages...'.orange
      conflictscmd = `grep --exclude=#{File.join(CREW_META_PATH, "#{pkg}.filelist")} --exclude=#{CREW_META_PATH}/\\\*_build.filelist -Fxf #{dir}/filelist #{CREW_META_PATH}/*.filelist`
      conflicts = conflictscmd.gsub(/(\.filelist|#{CREW_META_PATH})/, '').split("\n")
      conflicts.reject!(&:empty?)
    end
  elsif File.file?(File.join(CREW_META_PATH, "#{pkg}.filelist"))
    puts "Checking for conflicts of #{pkg} with files from installed packages...".orange
    conflictscmd = `grep --exclude=#{File.join(CREW_META_PATH, "#{pkg}.filelist")} --exclude=#{CREW_META_PATH}/\\\*_build.filelist -Fxf #{File.join(CREW_META_PATH, "#{pkg}.filelist")} #{CREW_META_PATH}/*.filelist`
    conflicts = conflictscmd.gsub(/(\.filelist|#{CREW_META_PATH})/, '').split("\n")
    conflicts.reject!(&:empty?)
  end
  if conflicts.any?
    puts 'There is a conflict with the same file in another package:'.orange
    puts conflicts.to_s.orange
  end
  conflicts.map! { |x| x.to_s.partition(':').last }
  return conflicts
end

def prepare_package(destdir)
  # Create the destdir if it does not exist, to avoid having to have
  # this single line in no_compile_needed packages.
  FileUtils.mkdir_p CREW_DEST_PREFIX
  Dir.chdir destdir do
    # Avoid a /usr/local/share/info/dir{.gz} file conflict:
    # The install-info program maintains a directory of installed
    # info documents in /usr/share/info/dir for the use of info
    # readers. This file must not be included in packages other
    # than install-info.
    # https://www.debian.org/doc/debian-policy/ch-docs.html#info-documents
    FileUtils.rm_f "#{CREW_DEST_PREFIX}/share/info/dir"

    # Remove all perl module files which will conflict.
    if @pkg.name =~ /^perl_/
      puts 'Removing .packlist and perllocal.pod files to avoid conflicts with other perl packages.'.orange
      system "find #{CREW_DEST_DIR} -type f \\( -name '.packlist' -o -name perllocal.pod \\) -delete"
    end

    # Compress manual files, and move errant files to the correct
    # locations.
    if File.exist?("#{CREW_DEST_PREFIX}/man")
      puts "Files in #{CREW_PREFIX}/man will be moved to #{CREW_MAN_PREFIX}.".orange
      FileUtils.mkdir_p CREW_DEST_MAN_PREFIX
      FileUtils.mv Dir["#{CREW_DEST_PREFIX}/man/*"], "#{CREW_DEST_MAN_PREFIX}/"
      Dir.rmdir "#{CREW_DEST_PREFIX}/man" if Dir.empty?("#{CREW_DEST_PREFIX}/man")
    end
    if File.exist?("#{CREW_DEST_PREFIX}/info")
      puts "Files in #{CREW_PREFIX}/info will be moved to #{CREW_PREFIX}/share/info.".orange
      FileUtils.mkdir_p "#{CREW_DEST_PREFIX}/share/info/"
      FileUtils.mv Dir["#{CREW_DEST_PREFIX}/info/*"], "#{CREW_DEST_PREFIX}/share/info/"
      Dir.rmdir "#{CREW_DEST_PREFIX}/info" if Dir.empty?("#{CREW_DEST_PREFIX}/info")
    end
    # Remove the "share/info/dir.*" file since it causes conflicts.
    FileUtils.rm_f Dir["#{CREW_DEST_PREFIX}/share/info/dir*"]
    compress_doc CREW_DEST_MAN_PREFIX
    compress_doc "#{CREW_DEST_PREFIX}/share/info"

    # Allow postbuild to override the filelist contents.
    @pkg.postbuild

    # Create the file list.
    system "find .#{CREW_PREFIX} -type f,l | cut -c2- | LC_ALL=C sort", out: %w[filelist a] if Dir.exist?(CREW_DEST_PREFIX)
    system "find .#{HOME} -type f,l | cut -c2- | LC_ALL=C sort", out: %w[filelist a] if Dir.exist?(CREW_DEST_HOME)

    if Dir.exist?("#{CREW_LOCAL_REPO_ROOT}/manifest") && File.writable?("#{CREW_LOCAL_REPO_ROOT}/manifest")
      FileUtils.mkdir_p "#{CREW_LOCAL_REPO_ROOT}/manifest/#{ARCH}/#{@pkg.name.chr.downcase}"
      FileUtils.cp 'filelist', "#{CREW_LOCAL_REPO_ROOT}/manifest/#{ARCH}/#{@pkg.name.chr.downcase}/#{@pkg.name}.filelist"
    end

    # Check for FHS3 compliance.
    puts 'Checking for FHS3 compliance...'
    errors = false
    fhs_compliant_prefix = %W[bin etc include lib #{ARCH_LIB} libexec opt sbin share var].uniq

    Dir.foreach(CREW_DEST_PREFIX) do |filename|
      next if %w[. ..].include?(filename)

      unless fhs_compliant_prefix.include?(filename)
        if CREW_FHS_NONCOMPLIANCE_ONLY_ADVISORY || @pkg.no_fhs?
          puts "Warning: #{CREW_PREFIX}/#(unknown) in #{@pkg.name} is not FHS3 compliant.".orange
        else
          puts "Error: #{CREW_PREFIX}/#(unknown) in #{@pkg.name} is not FHS3 compliant.".lightred
          errors = true
        end
      end
    end

    # Check for conflicts with other installed files.
    conflicts = determine_conflicts(Dir.pwd, @pkg.name)
    if conflicts.any?
      if CREW_CONFLICTS_ONLY_ADVISORY || @pkg.conflicts_ok?
        puts 'Warning: There is a conflict with the same file in another package.'.orange
      else
        puts 'Error: There is a conflict with the same file in another package.'.lightred
        errors = true
      end
      puts conflicts
    end

    # Abort if errors were encountered.
    abort 'Exiting due to above errors.'.lightred if errors

    # Make sure the package file has runtime dependencies added properly.
    system "#{CREW_LIB_PATH}/tools/getrealdeps.rb --use-crew-dest-dir #{@pkg.name}", exception: true unless @pkg.no_compile_needed?
    # Create the directory list.
    # Remove CREW_PREFIX and HOME from the generated directory list.
    crew_prefix_escaped = CREW_PREFIX.gsub('/', '\/')
    home_escaped = HOME.gsub('/', '\/')
    system "find .#{CREW_PREFIX} -type d | cut -c2- | sed '0,/#{crew_prefix_escaped}/{/#{crew_prefix_escaped}/d}' | LC_ALL=C sort", out: %w[dlist a] if Dir.exist?(CREW_DEST_PREFIX)
    system "find .#{HOME} -type d | cut -c2- | sed '0,/#{home_escaped}/{/#{home_escaped}/d}' | LC_ALL=C sort", out: %w[dlist a] if Dir.exist?(CREW_DEST_HOME)

    strip_dir destdir

    # Patchelf is currently disabled for security reasons.
    # See https://github.com/upx/upx/issues/655#issuecomment-1457434081
    # Use patchelf to set needed paths for all binaries.
    # patchelf_set_need_paths destdir

    # Use upx on executables.
    shrink_dir destdir
  end
end

def patchelf_set_need_paths(dir)
  return if @pkg.no_patchelf? || @pkg.no_compile_needed?

  puts 'Patchelf is currently disabled during builds due to problems with upx.'.yellow
  return

  # Disable the unreachable code check, as this is a temporary situation.
  # rubocop:disable Lint/UnreachableCode
  Dir.chdir dir do
    puts 'Running patchelf'.lightblue
    abort 'No Patchelf found!'.lightred unless File.file?("#{CREW_PREFIX}/bin/patchelf")
    execfiles = `find . -executable -type f ! \\( -name '*.a' \\) | xargs -P#{CREW_NPROC} -n1 sh -c '[ "$(head -c4 ${1})" = "\x7FELF" ] && echo ${1}' --`.chomp
    return if execfiles.empty?

    patchelf_lib_prefix = @pkg.is_musl? ? "#{CREW_MUSL_PREFIX}/lib" : CREW_LIB_PREFIX
    puts "patchelf_lib_prefix is #{patchelf_lib_prefix}" if @opt_verbose
    patchelf_interpreter = @pkg.is_musl? ? "#{CREW_MUSL_PREFIX}/lib/libc.so" : "#{CREW_LIB_PREFIX}/libc.so.6"
    puts "patchelf_interpreter is #{patchelf_interpreter}" if @opt_verbose

    puts 'Running patchelf to patch binaries for library paths'.lightblue
    execfiles.each_line(chomp: true) do |execfiletopatch|
      execfiletopatch = Dir.pwd + execfiletopatch.delete_prefix('.')
      neededlibs = `patchelf --print-needed #{execfiletopatch}`
      next if neededlibs.to_s.empty?

      neededlibs.each_line(chomp: true) do |neededlibspatch|
        next if neededlibspatch.include?(patchelf_lib_prefix.to_s)

        # Avoid segfaults from not using system versions of these files.
        patchelf_veto_files = %w[
          libdl.so
          ld-linux.so.2
          ld-linux-x86-64.so.2
          ld-linux-armhf.so.3
          libc.so.6
        ]
        next if !@pkg.is_musl? && patchelf_veto_files.any? { |i| neededlibspatch.include? i }

        neededlib_basename = File.basename(neededlibspatch)
        neededlibspatchednamepath = "#{patchelf_lib_prefix}/#{neededlib_basename}"
        # The first check here can be changed to just check the dest_dir
        # hierarchy for neededlib_basename if the intent is to allow
        # using a different CREW_PREFIX during package installs.
        if File.file?(neededlibspatchednamepath) || File.file?(Dir.pwd + neededlibspatchednamepath)
          puts "patchelf --replace-needed #{neededlibspatch} #{neededlibspatchednamepath} #{execfiletopatch}" if @opt_verbose
          system "patchelf --replace-needed #{neededlibspatch} #{neededlibspatchednamepath} #{execfiletopatch}"
        else
          puts "#{execfiletopatch} needed library #{neededlib_basename} not found in #{patchelf_lib_prefix} or #{Dir.pwd + neededlibspatchednamepath}.".lightred
        end
      end
      # Do not set the interpreter for non-musl builds, as this can break apps if there
      # is an issue with the crew glibc.
      next unless @pkg.is_musl?

      puts 'Running patchelf to patch binary interpreter paths'.lightblue
      system "patchelf --set-interpreter #{patchelf_interpreter} #{execfiletopatch}"
    end
  end
  # rubocop:enable Lint/UnreachableCode
end

def strip_find_files(find_cmd, strip_option = '')
  # Check whether crew should strip.
  return if CREW_NOT_STRIP || @pkg.no_strip? || !File.file?("#{CREW_PREFIX}/bin/llvm-strip")

  # Run find_cmd and strip only files with ar or elf magic headers.
  system "#{find_cmd} | xargs -r chmod u+w"
  strip_verbose = @opt_verbose ? 'echo "Stripping ${0:1}" &&' : ''
  # The craziness here is from having to escape the special characters
  # in the magic headers for these files.
  system "#{find_cmd} | xargs -P#{CREW_NPROC} -n1 -r bash -c 'header=$(head -c4 ${0}); elfheader='$(printf '\\\177ELF')' ; arheader=\\!\\<ar ; case $header in $elfheader|$arheader) #{strip_verbose} llvm-strip #{strip_option} ${0} ;; esac'"
end

def strip_dir(dir)
  unless CREW_NOT_STRIP || @pkg.no_strip? || @pkg.no_compile_needed?
    Dir.chdir dir do
      # Strip libraries with -S.
      puts 'Stripping libraries...'
      strip_find_files "find . -type f \\( -name 'lib*.a' -o -name 'lib*.so*' \\) -print", '-S'

      # Strip binaries, but not compressed archives.
      puts 'Stripping binaries...'
      extensions = %w[bz2 gz lha lz lzh rar tar tbz tgz tpxz txz xz Z zip zst]
      inames = extensions.join(' -o -iname *.')
      strip_find_files "find . -type f ! \\( -iname *.#{inames} \\) ! \\( -name 'lib*.a' -o -name 'lib*.so' \\) -perm /111 -print"
    end
  end
end

def shrink_dir(dir)
  unless CREW_NOT_SHRINK_ARCHIVE || @pkg.no_shrink?
    Dir.chdir dir do
      if File.file?("#{CREW_PREFIX}/bin/rdfind")
        puts 'Using rdfind to convert duplicate files to hard links.'
        system "#{CREW_PREFIX}/bin/rdfind -removeidentinode true -makesymlinks false -makehardlinks true -makeresultsfile false ."
      end
      # There are issues with non-x86_64 in compressing libraries, so just compress
      # non-libraries. Also note that one needs to use "upx -d" on a
      # compressed file to use ldd.
      # sommelier also isn't happy when sommelier and xwayland are compressed,
      # so don't compress those packages.
      if File.executable?("#{CREW_PREFIX}/bin/upx")
        # 1. Find executable binaries, but also check for hard linked
        #    files by making sure we have a unique set of
        #    inodes for the binaries found.
        # 2. Copy to a temp file.
        # 3. Compress using upx. (Uncompressible files are ignored.)
        # 4. Check compression by expanding the compressed file with
        #    upx.
        # 5. If the expansion doesn't error out then it is ok to copy
        #    over the original. (This also lets us avoid compressing
        #    hard linked files multiple times.)
        execfiles = `find . -executable -type f ! \\( -name '*.so*' -o -name '*.a' \\) | xargs -P8 -n1 sh -c '[ "$(head -c4 ${1})" = "\x7FELF" ] && echo ${1}' --`.chomp

        unless execfiles.empty?
          puts 'Using upx to shrink binaries.'
          # Copying in the ThreadPoolExecutor loop fails non-deterministically.
          execfiles.each_line(chomp: true) do |execfilecp|
            execfilecp.slice! '.'
            next if execfilecp.empty?

            execfilecp = File.join(dir, execfilecp)
            next unless File.file?(execfilecp)

            FileUtils.cp execfilecp, "#{execfilecp}-crewupxtmp"
          end
          begin
            gem 'concurrent-ruby'
          rescue Gem::LoadError
            puts ' -> install gem concurrent-ruby'
            Gem.install('concurrent-ruby')
            gem 'concurrent-ruby'
          end
          require 'concurrent'
          pool = Concurrent::ThreadPoolExecutor.new(
            min_threads: 1,
            max_threads: CREW_NPROC,
            max_queue: 0, # unbounded work queue
            fallback_policy: :caller_runs
          )
          execfiles.each_line(chomp: true) do |execfile|
            pool.post do
              execfile.slice! '.'
              execfile = File.join(dir, execfile)
              puts "Attempting to compress #{execfile} ...".orange
              # Make a tmp file for compression.
              unless system "upx --lzma #{execfile}-crewupxtmp"
                puts "Compression of #{execfile} failed...".orange if @opt_verbose
                FileUtils.rm_f "#{execfile}-crewupxtmp"
              end
              if File.file?("#{execfile}-crewupxtmp")
                puts "Testing compressed #{execfile}...".lightblue if @opt_verbose
                if system 'upx', '-t', "#{execfile}-crewupxtmp"
                  puts "#{execfile} successfully compressed...".lightgreen
                  FileUtils.cp "#{execfile}-crewupxtmp", execfile
                end
              end
              FileUtils.rm_f "#{execfile}-crewupxtmp"
            end
          end
          pool.shutdown
          pool.wait_for_termination
          # Make sure temporary compression copies are deleted.
          system 'find . -executable -type f -name "*-crewupxtmp" -delete'
        end
      end
    end
  end
end

def install_files(src, dst = File.join(CREW_PREFIX, src.delete_prefix('./usr/local')))
  if Dir.exist?(src)
    if File.executable?("#{CREW_PREFIX}/bin/crew-mvdir") && !CREW_DISABLE_MVDIR
      system "crew-mvdir #{@short_verbose} #{src} #{dst}", exception: true
    else
      warn 'crew-mvdir is not installed. Please install it with \'crew install crew_mvdir\' for improved installation performance'.yellow unless (@pkg.name == 'crew_mvdir') || CREW_DISABLE_MVDIR

      if File.executable?("#{CREW_PREFIX}/bin/rsync") && system("#{CREW_PREFIX}/bin/rsync --version > /dev/null")
        # The rsync src path needs a trailing slash.
        src << '/' unless src.end_with?('/')
        # Check for ACLs support.
        rsync_version = `rsync --version`.chomp
        if rsync_version.include?('ACLs') && !rsync_version.include?('no ACLs')
          system 'rsync', "-ah#{@verbose}HAXW", '--remove-source-files', src, dst, exception: true
        else
          system 'rsync', "-ah#{@verbose}HXW", '--remove-source-files', src, dst, exception: true
        end
      else
        system "cd #{src}; tar -cf - ./* | (cd #{dst}; tar -x#{@verbose}p --keep-directory-symlink -f -)", exception: true
      end
    end
  else
    abort "#{src} directory does not exist.".lightred
  end
end

def install_package(pkgdir)
  Dir.chdir pkgdir do
    # Install filelist, dlist and binary files.
    puts 'Performing install...'

    FileUtils.mv 'dlist', File.join(CREW_META_PATH, "#{@pkg.name}.directorylist"), verbose: @fileutils_verbose
    FileUtils.mv 'filelist', File.join(CREW_META_PATH, "#{@pkg.name}.filelist"), verbose: @fileutils_verbose

    unless CREW_NOT_LINKS || @pkg.no_links?
      brokensymlinks = `find . -type l -exec test ! -e {} \\; -print`.chomp
      unless brokensymlinks.to_s.empty?
        puts 'There are broken symlinks. Will try to fix.'.orange if @opt_verbose
        brokensymlinks.each_line(chomp: true) do |fixlink|
          brokentarget = `readlink -n #{fixlink}`.chomp
          puts "Attempting fix of: #{fixlink.delete_prefix('.')} -> #{brokentarget}".orange if @opt_verbose
          fixedtarget = brokentarget.delete_prefix(CREW_DEST_DIR)
          fixedlink_loc = File.join(pkgdir, fixlink.delete_prefix('.'))
          # If no changes were made, don't replace the symlink.
          unless fixedtarget == brokentarget
            FileUtils.ln_sf fixedtarget, fixedlink_loc
            puts "Fixed: #{fixedtarget} -> #{fixlink.delete_prefix('.')}".orange if @opt_verbose
          end
        end
      end
      if File.executable?("#{CREW_PREFIX}/bin/rdfind")
        puts 'Using rdfind to convert duplicate files to hard links.'
        system 'rdfind -removeidentinode true -makesymlinks false -makehardlinks true -makeresultsfile false .'
      end
    end

    install_files(".#{CREW_PREFIX}") if Dir.exist?(".#{CREW_PREFIX}")
    install_files(".#{HOME}", HOME) if Dir.exist?(".#{HOME}")
  end
end

def resolve_dependencies_and_install
  @resolve_dependencies_and_install = 1

  # Process the preflight block to see if the package should even
  # be downloaded or installed.
  pre_flight

  begin
    origin = @pkg.name

    @to_postinstall = []
    resolve_dependencies

    search origin, true
    install
    @to_postinstall.append(@pkg.name)
    @to_postinstall.each do |dep|
      search dep
      post_install
    end
  rescue InstallError => e
    abort "#{@pkg.name} failed to install: #{e}".lightred
  ensure
    # cleanup
    unless @opt_keep
      FileUtils.rm_rf Dir["#{CREW_BREW_DIR}/*"]
      FileUtils.mkdir_p "#{CREW_BREW_DIR}/dest" # this is a little ugly, feel free to find a better way
    end
  end

  # Warn of possible segfaults for older packages on AMD StoneyRidge platforms.
  # Family 21 identifies AMD Bulldozer/Piledriver/Steamroller/Excavator µArchs.
  puts <<~EOT.yellow if CREW_IS_AMD && CPUINFO['cpu family'] == '21'
    Notice: You are running an AMD StoneyRidge device; due to some bugs, some
    older packages may fail with a segmentation fault and need to be rebuilt.

    If this happens, please report them to:
    https://github.com/chromebrew/chromebrew/issues

    Otherwise, rebuilding from source (1) or disabling ASLR (2) usually solves the issue:
    (1) Run `crew reinstall -s #{@pkg.name}` to rebuild the package from source,
    __OR__
    (2) Execute `echo 0 | sudo tee /proc/sys/kernel/randomize_va_space` to disable ASLR.
    Warning: Disabling ASLR may create security issues, use it at your own risk!
  EOT

  puts "#{@pkg.name.capitalize} installed!".lightgreen
  @resolve_dependencies_and_install = 0
end

def resolve_dependencies
  @dependencies = @pkg.get_deps_list(return_attr: true)

  # Compare each dependency version with the required range (if installed).
  @dependencies.each do |dep|
    dep_name = dep.keys[0]
    dep_info = @device[:installed_packages].find { |pkg| pkg[:name] == dep_name }

    # Skip if the dependency is not installed.
    next unless dep_info

    _tags, version_check = dep.values[0]
    installed_version = dep_info[:version]

    next unless version_check

    # Abort if the range is not fulfilled.
    abort unless version_check.call(installed_version)
  end

  # Leave only dependency names (remove all package attributes returned by @pkg.get_deps_list).
  @dependencies.map!(&:keys).flatten!

  # Abort if we have incompatible dependencies.
  abort "Some dependencies are not compatible with your device architecture (#{ARCH}). Unable to continue.".lightred if @dependencies.any? { |dep| !Package.load_package("#{CREW_PACKAGES_PATH}/#{dep}.rb").compatible? }

  # Leave only packages that are not yet installed in the dependency list.
  @dependencies.reject! { |dep_name| @device[:installed_packages].any? { |pkg| pkg[:name] == dep_name } }

  # Run the preflight check for dependencies.
  @dependencies.each do |dep_name|
    dep_pkg_path = File.join(CREW_PACKAGES_PATH, "#{dep_name}.rb")
    Package.load_package(dep_pkg_path, dep_name).preflight
  end

  return if @dependencies.empty?

  puts 'The following packages also need to be installed: '

  @dependencies.each do |dep|
    abort "Dependency #{dep} was not found.".lightred unless File.file?(File.join(CREW_PACKAGES_PATH, "#{dep}.rb"))
  end

  puts @dependencies.join(' ')

  if @opt_force
    puts 'Proceeding with dependency package installation...'.orange
  else
    print 'Do you agree? [Y/n] '
    response = $stdin.gets.chomp.downcase
    case response
    when 'n', 'no'
      abort 'No changes made.'
    when '', 'y', 'yes'
      puts 'Proceeding...'
    else
      puts "I don't understand `#{response}`. :(".lightred
      abort 'No changes made.'
    end
  end

  @dependencies.each do |dep|
    search dep
    print_current_package
    install
  end
  if @resolve_dependencies_and_install.eql?(1) || @resolve_dependencies_and_build.eql?(1)
    @to_postinstall = @dependencies
  else
    # Make sure the sommelier postinstall happens last so the messages
    # from that are not missed by users.
    @dependencies = @dependencies.partition { |v| v != 'sommelier' }.reduce(:+)
    @dependencies.each do |dep|
      search dep
      post_install
    end
  end
end

def install
  if !@pkg.in_upgrade && @device[:installed_packages].any? { |pkg| pkg[:name] == @pkg.name }
    puts "Package #{@pkg.name} already installed, skipping...".lightgreen
    return
  end

  unless @pkg.is_fake?
    meta = download
    target_dir = unpack meta
    if meta[:source]
      # Build from source and place binaries at CREW_DEST_DIR.
      # CREW_DEST_DIR contains the usr/local/... hierarchy.
      build_and_preconfigure target_dir

      # Prepare filelist and dlist at CREW_DEST_DIR.
      prepare_package CREW_DEST_DIR

      # Use CREW_DEST_DIR.
      dest_dir = CREW_DEST_DIR
    else
      # Use the extracted binary directory.
      dest_dir = target_dir
    end
  end

  # Make a backup of the installed packages json file.
  # If this fails, the install should fail before we create any
  # damage, and we should roughly be at maximal disk space usage at this
  # point anyways.
  FileUtils.cp File.join(CREW_CONFIG_PATH, 'device.json'), "#{CREW_CONFIG_PATH}/device.json.tmp"

  # Remove the old package just before the file copy.
  if @pkg.in_upgrade
    puts 'Removing since upgrade or reinstall...'
    remove @pkg.name
  end

  unless @pkg.is_fake?
    # Perform the pre-install process.
    pre_install dest_dir

    # Perform the install process.
    install_package dest_dir

    unless (@resolve_dependencies_and_install == 1) || (@resolve_dependencies_and_build == 1)
      # Perform the post-install process.
      post_install
    end
  end

  # Add to installed packages.
  @device[:installed_packages].push(name: @pkg.name, version: @pkg.version, binary_sha256: @pkg.get_binary_sha256(@device[:architecture]))
  File.open("#{CREW_CONFIG_PATH}/device.json.tmp", 'w') do |file|
    output = JSON.parse @device.to_json
    file.write JSON.pretty_generate(output)
  end
  # Copy over the original if the write to the tmp file succeeds.
  FileUtils.cp "#{CREW_CONFIG_PATH}/device.json.tmp", File.join(CREW_CONFIG_PATH, 'device.json')
  FileUtils.rm "#{CREW_CONFIG_PATH}/device.json.tmp"
end

def resolve_dependencies_and_build
  @resolve_dependencies_and_build = 1

  @to_postinstall = []
  begin
    origin = @pkg.name

    # Mark the current package as required to compile from source.
    @pkg.build_from_source = true
    resolve_dependencies
    @to_postinstall.each do |dep|
      search dep
      post_install
    end
    search origin, true
    build_package CREW_LOCAL_BUILD_DIR
  rescue InstallError => e
    abort "#{@pkg.name} failed to build: #{e}".lightred
  ensure
    # cleanup
    unless @opt_keep
      FileUtils.rm_rf Dir["#{CREW_BREW_DIR}/*"], verbose: @fileutils_verbose
      FileUtils.mkdir_p "#{CREW_BREW_DIR}/dest", verbose: @fileutils_verbose # this is a little ugly, feel free to find a better way
    end
  end
  puts "#{@pkg.name} is built!".lightgreen
  @resolve_dependencies_and_build = 0
end

def build_package(crew_archive_dest)
  # Download the source code and unpack it.
  meta = download
  target_dir = unpack meta

  # Build from source and place binaries at CREW_DEST_DIR.
  build_and_preconfigure target_dir

  # Call the check method here. The check method is called by this function only,
  # so it is possible to place time-consuming tests in the check method.
  if Dir.exist? target_dir
    Dir.chdir target_dir do
      @pkg.check
    end
  end

  # Prepare filelist and dlist at CREW_DEST_DIR.
  prepare_package CREW_DEST_DIR

  # Build the package from the filelist, dlist and binary files in CREW_DEST_DIR.
  puts 'Archiving...'
  archive_package crew_archive_dest
end

def archive_package(crew_archive_dest)
  # Check to see that there is a working zstd.
  crew_prefix_zstd_available = File.file?("#{CREW_PREFIX}/bin/zstd")
  if @pkg.no_zstd? || !crew_prefix_zstd_available
    puts 'Using xz to compress package. This may take some time.'.lightblue
    pkg_name = "#{@pkg.name}-#{@pkg.version}-chromeos-#{@device[:architecture]}.tar.xz"
    Dir.chdir CREW_DEST_DIR do
      system "tar c#{@verbose}Jf #{crew_archive_dest}/#{pkg_name} *"
    end
  else
    puts 'Using zstd to compress package. This may take some time.'.lightblue
    pkg_name = "#{@pkg.name}-#{@pkg.version}-chromeos-#{@device[:architecture]}.tar.zst"
    Dir.chdir CREW_DEST_DIR do
      # Use the same zstd compression options as Arch, which privilege
      # decompression speed over compression speed.
      # See https://lists.archlinux.org/pipermail/arch-dev-public/2019-March/029542.html
      # Use nice so that the user can (possibly) do other things during compression.
      puts 'Using standard zstd'.lightblue if @opt_verbose
      system "tar c#{@verbose} * | nice -n 20 #{CREW_PREFIX}/bin/zstd -c -T0 --ultra -20 - > #{crew_archive_dest}/#{pkg_name}"
    end
  end
  system "sha256sum #{crew_archive_dest}/#{pkg_name} > #{crew_archive_dest}/#{pkg_name}.sha256"
  # Copy the package file for the successfully generated package to CREW_LOCAL_REPO_ROOT only if force is set.
  if @opt_force
    FileUtils.cp "#{CREW_PACKAGES_PATH}/#{@pkg.name}.rb", "#{CREW_LOCAL_REPO_ROOT}/packages/"
    puts "The package file used for #{@pkg.name} has been copied to #{CREW_LOCAL_REPO_ROOT}/packages/".lightblue
    if @device[:installed_packages].any? { |pkg| pkg[:name] == @pkg.name }
      puts "#{@pkg.name} will now be upgraded...".lightgreen
      @pkg.in_upgrade = true
      @pkg.build_from_source = false
      resolve_dependencies_and_install
      @pkg.in_upgrade = false
    else
      puts "#{@pkg.name} will now be installed...".lightgreen
      @pkg.build_from_source = false
      resolve_dependencies_and_install
    end
  end
end
def remove(pkg_name)
  # make sure the package is actually installed
  unless @device[:installed_packages].any? { |pkg| pkg[:name] == pkg_name } || File.file?(File.join(CREW_META_PATH, "#{pkg_name}.filelist"))
    puts "Package #{pkg_name} isn't installed.".lightred
    return
  end

  # Perform any operations required prior to package removal.
  search pkg_name, true
  @pkg.preremove unless @in_fixup

  # Preserve CREW_ESSENTIAL_FILES and make sure they are real files
  # and not symlinks, because preserving symlinked libraries does not
  # prevent breakage.
  CREW_ESSENTIAL_FILES.each do |file|
    next unless File.symlink?("#{CREW_LIB_PREFIX}/#{file}")

    canonicalized_file = `readlink -m #{CREW_LIB_PREFIX}/#{file}`.chomp
    if File.file?(canonicalized_file) && canonicalized_file.include?(CREW_PREFIX)
      puts "Replacing symlinked essential #{file} with hard link to #{canonicalized_file} to avoid breakage.".lightblue if @opt_verbose
      FileUtils.ln(canonicalized_file, "#{CREW_LIB_PREFIX}/#{file}", force: true)
    end
  end

  conflicts = determine_conflicts(Dir.pwd, pkg_name)

  # if the filelist exists, remove the files and directories installed by the package
  if File.file?(File.join(CREW_META_PATH, "#{pkg_name}.filelist"))
    Dir.chdir CREW_CONFIG_PATH do
      # remove all files installed by the package
      File.foreach("meta/#{pkg_name}.filelist", chomp: true) do |line|
        # Do not remove essential files which crew (and dependencies)
        # rely on, especially during package upgrades or reinstalls.
        # These essential files are enumerated in const.rb as
        # CREW_ESSENTIAL_FILES.
        if CREW_ESSENTIAL_FILES.include?(File.basename(line))
          puts "Removing #{line} will break crew. It was #{'NOT'.lightred} deleted." if @opt_verbose
        else
          puts "Removing file #{line}".lightred if @opt_verbose
          puts "filelist contains #{line}".lightred if @opt_verbose && !line.include?(CREW_PREFIX)
          if line.start_with?(CREW_PREFIX)
            if conflicts.include?(line)
              puts "#{line} is in another package. It will not be removed during the removal of #{pkg_name}".orange
            else
              FileUtils.rm_rf line
            end
          end
        end
      end

      # remove all directories installed by the package
      File.foreach("meta/#{pkg_name}.directorylist", chomp: true) do |line|
        puts "directorylist contains #{line}".lightred if @opt_verbose && !line.include?(CREW_PREFIX)
        next unless Dir.exist?(line) && Dir.empty?(line) && line.include?(CREW_PREFIX)

        puts "Removing directory #{line}".lightred if @opt_verbose
        FileUtils.rmdir(line)
      end

      # remove the file and directory lists
      FileUtils.rm_f Dir["meta/#{pkg_name}.{file,directory}list"]
    end
  end

  # remove from installed packages
  puts "Removing package #{pkg_name}".lightred if @opt_verbose
  @device[:installed_packages].delete_if { |elem| elem[:name] == pkg_name }

  # update the device manifest
  File.write "#{CREW_CONFIG_PATH}/device.json", JSON.pretty_generate(JSON.parse(@device.to_json))

  search pkg_name, true
  @pkg.remove unless @in_fixup

  puts "#{pkg_name.capitalize} removed!".lightgreen
end
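
The symlink-to-hard-link swap above can be exercised in isolation. This is a minimal sketch using throwaway temp paths (`libfoo.so` is a made-up name, not a real CREW_LIB_PREFIX entry); it relies on `FileUtils.ln` with `force: true` overwriting an existing symlink in place, just as `remove` does:

```ruby
require 'fileutils'
require 'tmpdir'

dir  = Dir.mktmpdir
real = File.join(dir, 'libfoo.so.1.2.3') # stand-in for a versioned library
link = File.join(dir, 'libfoo.so')       # stand-in for an essential symlink
File.write(real, 'elf bytes')
FileUtils.ln_s(real, link)

# Replace the symlink with a hard link to its resolved target, so that
# deleting the package owning `real` cannot break consumers of `link`.
FileUtils.ln(File.realpath(link), link, force: true)
```

After this, `link` survives even if `real` is unlinked, because both names now reference the same inode.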

def print_deps_tree(args)
  warn 'Walking through dependencies recursively, this may take a while...', ''

  # dep_hash: Hash object returned by @pkg.get_deps_list
  dep_hash = @pkg.get_deps_list(hash: true, include_build_deps: args['--include-build-deps'] || 'auto', exclude_buildessential: args['--exclude-buildessential'])

  # convert the returned hash to json and format it
  json_view = JSON.pretty_generate(dep_hash)

  # convert the formatted json string to a tree structure
  tree_view = json_view.gsub(/\{\s*/m, '└─────').gsub(/[\[\]{},":]/, '').gsub(/^\s*$\n/, '').gsub(/\s*$/, '')

  # Add pipe chars to connect endpoints and starting points, improving readability:
  # find the horizontal location of all arrow symbols.
  index_with_pipe_char = tree_view.lines.map { |line| line.index('└') }.compact.uniq

  # determine whether a pipe char should be added, based on the horizontal location of the arrow symbols
  tree_view = tree_view.lines.each_with_index.map do |line, line_i|
    index_with_pipe_char.each do |char_i|
      # Check if there are any non-space chars (pkg_names) between the starting point ([line_i][char_i])
      # and the endpoint ([next_arrow_line_offset][char_i]) vertically.
      # (This determines whether the starting point and endpoint are in the same branch; if so, connect them with pipe chars.)
      next_arrow_line_offset = tree_view.lines[line_i..].index { |l| l[char_i] == '└' }
      have_line_with_non_empty_char = tree_view.lines[line_i + 1..line_i + next_arrow_line_offset.to_i - 1].any? { |l| l[char_i].nil? or l[char_i] =~ /\S/ }

      line[char_i] = '│' if next_arrow_line_offset && (line[char_i] == ' ') && !have_line_with_non_empty_char
    end
    next line
  end.join

  # replace arrow symbols with a tee symbol at branch intersections
  tree_view = tree_view.lines.each_with_index.map do |line, line_i|
    # orig_arrow_index_connecter: the horizontal location of the arrow symbol used to connect the parent branch
    #
    # example:
    # └───┬─chrome
    #     └─────buildessential
    #     ^
    orig_arrow_index_connecter = line.index('└')
    # orig_arrow_index_newbranch: the horizontal location of the "box drawing char" that MIGHT need
    # to be converted to a tee char in order to connect a child branch,
    # located 4 chars after orig_arrow_index_connecter
    #
    # example:
    #     v
    # └─────chrome
    #     └─────buildessential
    #
    # which might need to be converted to:
    # └───┬─chrome
    #     └─────buildessential
    orig_arrow_index_newbranch = orig_arrow_index_connecter + 4

    # if the char below the processing arrow symbol (at orig_arrow_index_connecter) is also an arrow or a pipe, change the processing char to a tee symbol
    line[orig_arrow_index_connecter] = '├' if orig_arrow_index_connecter && tree_view.lines[line_i + 1].to_s[orig_arrow_index_connecter] =~ /[└│]/
    # if the char below the char being processed (at orig_arrow_index_newbranch) is an arrow or a tee, change the processing char to a tee symbol
    line[orig_arrow_index_newbranch] = '┬' if orig_arrow_index_newbranch && tree_view.lines[line_i + 1].to_s[orig_arrow_index_newbranch] =~ /[└├]/
    next line # return the modified line
  end.join

  if String.use_color
    puts <<~EOT, ''
      \e[45m \e[0m: satisfied dependency
      \e[46m \e[0m: build dependency
      \e[47m \e[0m: runtime dependency
    EOT
    # (the first string in each #{} is used for commenting only, and will not be included in the output)

    # replace the special symbols returned by @pkg.get_deps_list with actual color codes
    tree_view.gsub!(/\*(.+)\*/, '\1'.lightcyan)
    tree_view.gsub!(/\+(.+)\+/, "\e[45m\\1\e[0m")
  end

  puts tree_view
end
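
The gsub chain at the top of print_deps_tree is easier to follow on a toy input. A minimal sketch with a hypothetical two-package hash (chrome depending on buildessential), shaped like the hash @pkg.get_deps_list(hash: true) returns:

```ruby
require 'json'

# Hypothetical dependency hash: package names map to arrays of child hashes.
dep_hash = { 'chrome' => [{ 'buildessential' => [] }] }

json_view = JSON.pretty_generate(dep_hash)

# Same pipeline as print_deps_tree: each opening brace (plus the whitespace
# after it) becomes an arrow, then JSON punctuation, blank lines and
# trailing whitespace are stripped.
tree_view = json_view.gsub(/\{\s*/m, '└─────')
                     .gsub(/[\[\]{},":]/, '')
                     .gsub(/^\s*$\n/, '')
                     .gsub(/\s*$/, '')

puts tree_view
# └─────chrome
#     └─────buildessential
```

The JSON pretty-printer's indentation is what positions each `└`, which is why the later passes can key off `line.index('└')`.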

def upload(pkg_name = nil)
  abort "\nGITLAB_TOKEN environment variable not set.\n".lightred if ENV.fetch('GITLAB_TOKEN', nil).nil?

  packages = pkg_name.to_s
  binary_compression_set = nil
  gitlab_token = ENV.fetch('GITLAB_TOKEN', nil)
  base_url = 'https://gitlab.com/api/v4/projects/26210301/packages/generic'

  if pkg_name.nil?
    %w[.tar.xz .tar.zst].each do |ext|
      packages += `find #{CREW_LOCAL_REPO_ROOT}/release/*/*#{ext} -exec basename -s #{ext} {} + | cut -d- -f1 | LC_ALL=C sort | uniq | xargs`
      packages += ' '
    end
    abort 'No package binaries found.'.lightred if packages.empty?
  end

  packages.strip!

  packages.split.each do |package|
    %w[x86_64 i686 armv7l].each do |arch|
      release_dir = "#{CREW_LOCAL_REPO_ROOT}/release/#{arch}"
      pkg_file = "#{CREW_LOCAL_REPO_ROOT}/packages/#{package}.rb"
      new_tarfile = Dir["#{release_dir}/#{package}-*-chromeos-#{arch}.{tar.xz,tar.zst}"].max_by { |f| File.mtime(f) }
      if new_tarfile.nil?
        puts "#{release_dir}/#{package}-*-chromeos-#{arch}.(tar.xz|tar.zst) not found.\n".lightred
        next
      end
      if binary_compression_set.nil?
        ext = File.extname(new_tarfile)
        puts "Setting binary compression in #{pkg_file}..."
        # Set the binary compression, adding the line if it doesn't exist.
        if File.read(pkg_file).include?('binary_compression')
          puts "sed -i \"s/binary_compression.*/binary_compression 'tar#{ext}'/\" #{pkg_file}" if @opt_verbose
          system "sed -i \"s/binary_compression.*/binary_compression 'tar#{ext}'/\" #{pkg_file}"
        elsif File.read(pkg_file).include?('source_sha256')
          puts "sed -i \"/source_sha256/a \\\ \\\ binary_compression 'tar#{ext}'\" #{pkg_file}" if @opt_verbose
          system "sed -i \"/source_sha256/a \\\ \\\ binary_compression 'tar#{ext}'\" #{pkg_file}"
        elsif File.read(pkg_file).include?('git_hashtag')
          puts "sed -i \"/git_hashtag/a \\\ \\\ binary_compression 'tar#{ext}'\" #{pkg_file}" if @opt_verbose
          system "sed -i \"/git_hashtag/a \\\ \\\ binary_compression 'tar#{ext}'\" #{pkg_file}"
        else
          puts "Unable to tell where to add \"binary_compression 'tar#{ext}'\" to #{pkg_file}. Please add it manually.".lightblue
        end
        binary_compression_set = 1
      end
      puts "Package: #{package}, Arch: #{arch}".yellow
      puts 'Generating sha256sum ...'
      new_sha256 = Digest::SHA256.hexdigest(File.read(new_tarfile))
      puts "Uploading #{new_tarfile} ..."
      noname = new_tarfile.split("#{package}-").last
      new_version = noname.split('-chromeos').first
      new_url = "#{base_url}/#{package}/#{new_version}_#{arch}/#{new_tarfile}".gsub("#{release_dir}/", '')
      token_label = gitlab_token.split('-').first == 'glpat' ? 'PRIVATE-TOKEN' : 'DEPLOY-TOKEN'
      puts "curl -# --header \"#{token_label}: #{gitlab_token}\" --upload-file \"#{new_tarfile}\" \"#{new_url}\" | cat" if @opt_verbose
      output = `curl -# --header "#{token_label}: #{gitlab_token}" --upload-file "#{new_tarfile}" "#{new_url}" | cat`.chomp
      if output.include?('201 Created')
        puts "curl -Ls #{new_url} | sha256sum" if @opt_verbose
        upstream_sha256 = `curl -Ls #{new_url} | sha256sum`.chomp.split.first
        if upstream_sha256 == new_sha256
          puts output.lightgreen
        else
          if @opt_verbose
            puts "expected sha256 hash=#{new_sha256}"
            puts "upstream sha256 hash=#{upstream_sha256}"
          end
          puts "#{output}. Checksum mismatch. Skipping binary_sha256 update in #{pkg_file}...".lightred
          next
        end
      else
        puts output.lightred
        puts "#{output}. Unable to upload. Skipping binary_sha256 update in #{pkg_file}...".lightred
        next
      end
      old_sha256 = `grep -m 1 #{arch}: #{pkg_file} 2> /dev/null`.chomp
      if old_sha256.empty?
        unless File.readlines(pkg_file).grep(/binary_sha256/).any?
          if @opt_verbose
            puts "sed -e '/binary_compression/ a\\
\\
\\ \\ binary_sha256({' -i #{pkg_file}"
          end
          system "sed -e '/binary_compression/ a\\
\\
\\ \\ binary_sha256({' -i #{pkg_file}"
        end
        puts "Adding binary_sha256 to #{pkg_file}..."
        puts "#{arch}: '#{new_sha256}'"
        unless new_sha256.empty?
          update_sha256(pkg_file, arch, new_sha256)
          update_sha256(pkg_file, 'aarch64', new_sha256) if arch.eql?('armv7l')
        end
      else
        old_sha256 = old_sha256.split("'")[1]
        if old_sha256 == new_sha256
          puts "Skipping binary_sha256 update in #{pkg_file}..."
        else
          puts "Updating binary_sha256 in #{pkg_file}..."
          puts "from: #{arch}: '#{old_sha256}'"
          puts "  to: #{arch}: '#{new_sha256}'"
          puts "sed -i 's/#{old_sha256}/#{new_sha256}/g' #{pkg_file}" if @opt_verbose
          system "sed -i 's/#{old_sha256}/#{new_sha256}/g' #{pkg_file}"
        end
      end
      # Use rubocop to sanitize the package file, and let errors get flagged.
      system 'yes | crew install ruby_rubocop' unless @device[:installed_packages].any? { |elem| elem[:name] == 'ruby_rubocop' }
      puts "Using rubocop to sanitize #{pkg_file}.".orange
      system "rubocop -c #{File.join(CREW_LOCAL_REPO_ROOT, '.rubocop.yml')} -A #{pkg_file}", exception: true
    end
  end
end
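
The version string is recovered from the tarball name by splitting on the package prefix and then on the -chromeos marker; taking `.last` also discards the leading release_dir path. A quick sketch on a hypothetical artifact name (zstd 1.5.5 here is made up for illustration):

```ruby
package     = 'zstd'
new_tarfile = 'release/x86_64/zstd-1.5.5-chromeos-x86_64.tar.zst' # hypothetical

# Everything after the last "<package>-" ...
noname      = new_tarfile.split("#{package}-").last
# ... up to the "-chromeos" marker is the version.
new_version = noname.split('-chromeos').first

puts new_version
# 1.5.5
```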

def update_sha256(package, key, value)
  case key
  when 'aarch64'
    leading_space = '\\ \\ \\ \\ '
  when 'armv7l', 'x86_64'
    leading_space = '\\ \\ \\ \\ \\ '
  when 'i686'
    leading_space = '\\ \\ \\ \\ \\ \\ \\ '
  end
  comma = key == 'x86_64' ? '\\n\\ \\ })' : ','
  if File.readlines(package).grep(/#{key}:/).any?
    puts "sed -e \"/#{key}:.*['\"][0-9a-f]*['\"]/c#{leading_space}#{key}: '#{value}'#{comma}\" -i #{package}" if @opt_verbose
    system "sed -e \"/#{key}:.*['\"][0-9a-f]*['\"]/c#{leading_space}#{key}: '#{value}'#{comma}\" -i #{package}"
  else
    puts "sed -e \"/binary_sha256.*({/a#{leading_space}#{key}: '#{value}'#{comma}\" -i #{package}" if @opt_verbose
    system "sed -e \"/binary_sha256.*({/a#{leading_space}#{key}: '#{value}'#{comma}\" -i #{package}"
  end
end
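
The escaped-space prefixes above look arbitrary, but each `\ ` pair just becomes one literal space in the generated sed script, right-aligning the arch keys inside the binary_sha256 hash. A small sketch with the per-arch counts copied from the method (the '<sha256>' value is a placeholder):

```ruby
# Spaces per arch key, matching the leading_space strings in update_sha256.
pads = { 'aarch64' => 4, 'armv7l' => 5, 'x86_64' => 5, 'i686' => 7 }

pads.each do |key, width|
  puts "#{' ' * width}#{key}: '<sha256>'" # every key ends at the same column
end
```

All four keys end at the same column (for example, 4 + 'aarch64'.length == 7 + 'i686'.length == 11), so the hash stays aligned no matter which arch line sed rewrites.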

def copy_package(pkg_name, prompt_msg = '')
  next_pkg = nil
  if @opt_force
    FileUtils.cp "#{CREW_LOCAL_REPO_ROOT}/packages/#{pkg_name}.rb", "#{CREW_PACKAGES_PATH}/"
    puts "\nCopied #{CREW_LOCAL_REPO_ROOT}/packages/#{pkg_name}.rb to #{CREW_PACKAGES_PATH}.\n".lightgreen
  else
    # This pulls the operation from the calling function.
    operation = caller_locations(1, 2)[1].to_s.split[3].split('_')[0]
    puts prompt_msg.yellow
    print "\nWould you like to copy #{pkg_name}.rb to crew and start the #{operation}? [Y/n] ".yellow
    response = $stdin.gets.chomp.downcase
    case response
    when 'n', 'no'
      puts "#{operation.capitalize} skipped."
      next_pkg = true
    when '', 'y', 'yes'
      FileUtils.cp "#{CREW_LOCAL_REPO_ROOT}/packages/#{pkg_name}.rb", "#{CREW_PACKAGES_PATH}/"
      puts "\nCopied #{CREW_LOCAL_REPO_ROOT}/packages/#{pkg_name}.rb to #{CREW_PACKAGES_PATH}.\n".lightgreen
    else
      puts "I don't understand `#{response}`. :(".lightred
      puts "#{operation.capitalize} skipped."
      next_pkg = true
    end
  end
  return next_pkg
end

def check_package(pkg_name)
  return unless Dir.exist? CREW_LOCAL_REPO_ROOT
  return copy_package(pkg_name) if @opt_force

  # Prompt to copy the local repo package to crew if the package is not found.
  if !File.file?("#{CREW_PACKAGES_PATH}/#{pkg_name}.rb") && File.file?("#{CREW_LOCAL_REPO_ROOT}/packages/#{pkg_name}.rb")
    prompt_msg = "\nThe crew package #{pkg_name} does not exist."
    return copy_package(pkg_name, prompt_msg)
  end

  # Compare the local repo package to the crew repo package and prompt to copy if necessary to prepare for the operation.
  crew_package_updated = ''
  Dir.chdir CREW_PACKAGES_PATH do
    crew_package_updated = `git diff #{CREW_PACKAGES_PATH}/#{pkg_name}.rb`.chomp
  end
  local_package_updated = ''
  Dir.chdir CREW_LOCAL_REPO_ROOT do
    local_package_updated = `git diff #{CREW_LOCAL_REPO_ROOT}/packages/#{pkg_name}.rb`.chomp
  end
  if local_package_updated != '' && crew_package_updated == ''
    prompt_msg = "\n#{CREW_LOCAL_REPO_ROOT}/packages/#{pkg_name}.rb has been updated but the crew package is unchanged."
    return copy_package(pkg_name, prompt_msg)
  end
  if local_package_updated != '' && crew_package_updated != '' && local_package_updated != crew_package_updated
    prompt_msg = "\n#{CREW_LOCAL_REPO_ROOT}/packages/#{pkg_name}.rb has been updated and does not match the updated crew package."
    return copy_package(pkg_name, prompt_msg)
  end
end

def build_command(args)
  abort 'Unable to locate local repo root directory. Change to a local chromebrew git repo directory and try again.'.lightred unless Dir.exist? CREW_LOCAL_REPO_ROOT
  abort 'Change to a local chromebrew git repo directory and try again.'.lightred if CREW_PACKAGES_PATH.include?(CREW_LOCAL_REPO_ROOT)
  unless Dir.exist? CREW_LOCAL_BUILD_DIR
    if @opt_force
      puts "Attempting to create local build directory at #{CREW_LOCAL_BUILD_DIR} ...".orange
      FileUtils.mkdir_p CREW_LOCAL_BUILD_DIR
    else
      abort "Unable to locate local build directory #{CREW_LOCAL_BUILD_DIR}. It will be created if you build with the '-f' flag.".lightred
    end
  end
  abort "#{CREW_LOCAL_BUILD_DIR} is not writable.".lightred unless File.writable?(CREW_LOCAL_BUILD_DIR)
  args['<name>'].each do |name|
    # If a package file is explicitly passed, then use that package file, wherever it is.
    if name.include?('.rb') && File.file?(name)
      FileUtils.cp name, "#{CREW_PACKAGES_PATH}/"
      @pkg_name = File.basename(name).gsub('.rb', '')
    else
      @pkg_name = name
    end
    next if check_package(@pkg_name)

    search @pkg_name
    print_current_package @opt_verbose
    next unless @pkg_name

    # Process the preflight block to see if the package should be built.
    pre_flight

    if !@pkg.is_fake? && @pkg.compatible? && @pkg.source?(ARCH) && (@pkg.no_source_build? || @pkg.source_url.to_s.upcase != 'SKIP') && !@pkg.no_compile_needed?
      resolve_dependencies_and_build
    else
      puts 'Unable to build a fake package. Skipping build.'.lightred if @pkg.is_fake?
      puts "Package #{@pkg.name} is not compatible with your device architecture (#{ARCH}). Skipping build.".lightred unless @pkg.compatible?
      puts 'Unable to build without source. Skipping build.'.lightred unless @pkg.source?(ARCH) && @pkg.source_url.to_s.upcase != 'SKIP'
      puts 'Compile not needed. Skipping build.'.lightred if @pkg.no_compile_needed?
    end
  end
  puts "Builds are located in #{CREW_LOCAL_BUILD_DIR}.".yellow
end

def const_command(args)
  args['<name>'].each do |name|
    Command.const(name)
  end.empty? && Command.const(nil)
end

def deps_command(args)
  args['<name>'].each do |name|
    @pkg_name = name
    search @pkg_name

    if args['--tree']
      # print the dependency tree if --tree is specified
      print_deps_tree(args)
    elsif args['--deep']
      system "#{CREW_LIB_PATH}/tools/getrealdeps.rb #{name}"
    else
      # print dependencies in install order if --tree is not specified
      puts @pkg.get_deps_list(include_build_deps: args['--include-build-deps'] || 'auto', exclude_buildessential: args['--exclude-buildessential'])
    end
  end
end

def download_command(args)
  args['<name>'].each do |name|
    @pkg_name = name
    search @pkg_name
    @pkg.build_from_source = true if @opt_source
    print_current_package @opt_verbose
    download
  end
end

def files_command(args)
  args['<name>'].each do |name|
    @pkg_name = name
    search @pkg_name
    print_current_package
    files name
  end
end

def help_command(args)
  Command.help(args['<command>'], args['<subcommand>'])
end

def install_command(args)
  args['<name>'].each do |name|
    @pkg_name = name
    # Exit early if the package is already installed. This prevents the
    # postinstall from being run for an already installed package.
    if @device[:installed_packages].any? { |pkg| pkg[:name] == @pkg_name }
      puts "Package #{@pkg_name} already installed, skipping...".lightgreen
      next
    end
    next if check_package(@pkg_name)
    search @pkg_name
    print_current_package true
    @pkg.build_from_source = true if @opt_source || @opt_recursive || CREW_BUILD_FROM_SOURCE
    next unless @pkg_name

    if @pkg.compatible?
      resolve_dependencies_and_install
    else
      puts "Package #{@pkg.name} is not compatible with your device architecture (#{ARCH}). Skipping install.".lightred
    end
  end
end

def list_command(args)
  Command.list(args['available'], args['installed'], args['compatible'], args['incompatible'], @opt_verbose)
end

def postinstall_command(args)
  args['<name>'].each do |name|
    @pkg_name = name
    search @pkg_name, true
    if @device[:installed_packages].any? { |elem| elem[:name] == @pkg_name }
      @pkg.postinstall
    else
      puts "Package #{@pkg_name} is not installed. :(".lightred
    end
  end
end

def prop_command(_)
  prop
end

def reinstall_command(args)
  args['<name>'].each do |name|
    @pkg_name = name
    next if check_package(@pkg_name)
    search @pkg_name
    print_current_package
    @pkg.build_from_source = true if @opt_source || @opt_recursive || CREW_BUILD_FROM_SOURCE
    next unless @pkg_name

    if @pkg.compatible?
      @pkg.in_upgrade = true
      resolve_dependencies_and_install
      @pkg.in_upgrade = false
    else
      puts "Package #{@pkg.name} is not compatible with your device architecture (#{ARCH}). Skipping reinstall.".lightred
    end
  end
end

def remove_command(args)
  args['<name>'].each { |name| remove name }
end

def search_command(args)
  args['<name>'].each do |name|
    regexp_search name
  end.empty? && list_packages
end

def sysinfo_command(_args)
  lsb_release = if File.file?('/etc/lsb-release')
                  File.read('/etc/lsb-release').scan(/^(.+?)=(.+)$/).to_h
                else
                  # newer versions of ChromeOS export this info to the env by default
                  ENV
                end

  git_commit_message_format = '%h `%s (%cr)`'

  sysinfo_markdown_header = <<~MDHEADER
    <details><summary>Expand</summary>

  MDHEADER
  sysinfo_markdown_body = <<~MDBODY
    - Architecture: `#{KERN_ARCH}` (`#{ARCH}`)
    - Processor vendor: `#{CPUINFO['vendor_id'] || 'ARM'}`
    - User space: `#{Dir.exist?('/lib64') ? '64' : '32'}-bit`
    - Chromebrew Kernel version: `#{CREW_KERNEL_VERSION}`
    - Chromebrew Running in Container: `#{CREW_IN_CONTAINER}`

    - Chromebrew version: `#{CREW_VERSION}`
    - Chromebrew prefix: `#{CREW_PREFIX}`
    - Chromebrew libdir: `#{CREW_LIB_PREFIX}`

    - Last update in local repository: #{`git -C '#{CREW_LIB_PATH}' show -s --format='#{git_commit_message_format}'`.chomp}

    - OS variant: `#{lsb_release['CHROMEOS_RELEASE_NAME']}`
    - OS version: `#{lsb_release['CHROMEOS_RELEASE_BUILDER_PATH']}`
    - OS channel: `#{lsb_release['CHROMEOS_RELEASE_TRACK']}`
  MDBODY
  sysinfo_markdown_footer = <<~MDFOOTER

    </details>
  MDFOOTER
  if @opt_verbose
    puts sysinfo_markdown_header, sysinfo_markdown_body, sysinfo_markdown_footer
  else
    puts sysinfo_markdown_body.tr('`', '')
  end
end

def test_command(args)
  test_commands_path = "#{CREW_LIB_PATH}/tests/commands"
  if args['<name>'].empty?
    Dir["#{test_commands_path}/*.rb"].each do |name|
      basename = File.basename(name, '.rb')
      puts "Testing #{basename} command ...".yellow
      system("ruby #{name}")
    end
  else
    args['<name>'].each do |name|
      basename = File.basename(name, '.rb')
      name = basename if basename != name
      if File.file?("#{test_commands_path}/#{name}.rb")
        Dir.chdir(test_commands_path) do
          puts "Testing #{name} command ...".yellow
          system("ruby #{name}.rb")
        end
      else
        puts "The #{name} command or test does not exist. Test skipped.".orange
      end
    end
  end
end

def update_command(args)
  if args['<compatible>']
    generate_compatible
  else
    update
  end
end

def upgrade_command(args) = upgrade(*args['<name>'], build_from_source: @opt_source)

def upload_command(args)
  upload if args['<name>'].empty?
  args['<name>'].each do |name|
    search name
    upload name if @pkg.name
  end
end

def whatprovides_command(args)
  args['<pattern>'].each do |name|
    whatprovides name
  end
end

def command?(name) = !!!name[/^[-<]/]
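
String#[] with a Regexp returns the matched substring (truthy) or nil, and the triple bang collapses that into a plain Boolean, so command? is true exactly when the docopt key does not start with '-' (an option) or '<' (a positional placeholder). The same one-liner, re-declared standalone:

```ruby
def command?(name) = !!!name[/^[-<]/]

puts command?('install')   # true  -> a command word
puts command?('--verbose') # false -> an option
puts command?('<name>')    # false -> a positional placeholder
```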

Signal.trap('INT') do
  if CREW_CACHE_FAILED_BUILD && CREW_CACHE_ENABLED && @pkg.in_build
    cache_build
    ExitMessage.add 'The build was interrupted. The build directory was cached.'.lightred
    exit 1
  end
  ExitMessage.add 'Interrupted!'.lightred
  exit 1
end

load_json
command_name = args.select { |k, v| v && command?(k) }.keys[0]
send("#{command_name}_command", args)