Friday, November 13, 2020

Can't locate object method via package "1" (perhaps you forgot to load "1"?)

This Perl error means that when you call $obj->method(), $obj is the number 1 instead of an instance of a Perl class. 1->my_method() doesn't work.

I triggered the error by failing to return $self from my constructor.

 

NoError.pm:

package NoError;

sub new {
    my $class = shift;
    my $self  = {};
    bless $self, $class;
    return $self;    # the right way to do it
}

sub hi { return 'hi'; }

1;

OneError.pm:

package OneError;

sub new {
    my $class = shift;
    my $self  = {};
    bless $self, $class;
    print "made a OneError\n";    # print returns 1; with no explicit return,
                                  # new() returns that 1 instead of $self
}

sub hi { return 'hi'; }

1;

test.pl:

#!/usr/bin/perl

use FindBin;
use lib "$FindBin::Bin";

use OneError;
use NoError;

my $no_obj = NoError->new();
print $no_obj->hi(), "\n";    # works fine: new() returns $self

my $oe_obj = OneError->new();
print $oe_obj->hi(), "\n";    # new() returns 1; 1->hi() doesn't work

Gabor Szabo has a helpful blog post about interpreting @_ in scalar context. If you write my $obj = @_; instead of my ($obj) = @_; you get $obj == 1, the number of elements in @_, rather than an actual object.

Sunday, July 28, 2019

Comparing many json files

I wrote a tool to compare JSON files; code and documentation are on GitHub.

SAMPLE OUTPUT


  :color:red:file01.json
  :color:red:file02.json
  :color:red:file03.json
  :color:red:file04.json

  :fruit:apple:file01.json
  :fruit:cherry:file02.json
  :fruit:apple:file03.json
  :fruit:MISSING:file04.json

  :inspiration:art:Pablo Picasso:file01.json
  :inspiration:art:Frida Kahlo:file02.json
  :inspiration:art:Pablo Picasso:file03.json
  :inspiration:art:MISSING:file04.json

  :inspiration:music:Dead Kennedys:file01.json
  :inspiration:music:Dead Kennedys:file02.json
  :inspiration:music:Dead Kennedys:file03.json
  :inspiration:music:MISSING:file04.json

  :inspiration:tools:MISSING:file01.json
  :inspiration:tools:["hammer", "rack"]:file02.json
  :inspiration:tools:MISSING:file03.json
  :inspiration:tools:MISSING:file04.json

  :vegetable:MISSING:file01.json
  :vegetable:MISSING:file02.json
  :vegetable:spinach:file03.json
  :vegetable:MISSING:file04.json

SAMPLE INPUT

file01.json

{
   "color": "red",
   "fruit": "apple",
   "inspiration": {"art":"Pablo Picasso","music":"Dead Kennedys"}
  }

file02.json

  {
   "color": "red",
   "fruit": "cherry",
   "inspiration": {"art":"Frida Kahlo","music":"Dead Kennedys","tools":["hammer","rack"]}
  }

file03.json

 {
   "vegetable": "spinach",
   "color": "red",
   "fruit": "apple",
   "inspiration": {"art":"Pablo Picasso","music":"Dead Kennedys"}
  }

file04.json

{
   "color": "red"
  }
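The core of such a tool can be sketched in a few lines of Python. This is a minimal sketch, not the code on GitHub: flatten and compare are hypothetical names, and real-world JSON can contain shapes this doesn't handle.

```python
import json

def flatten(obj, prefix=""):
    """Flatten nested dicts into {":path:to:key": value-string} pairs."""
    if not isinstance(obj, dict):
        # render lists the way the sample output does; everything else as str
        return {prefix: json.dumps(obj) if isinstance(obj, list) else str(obj)}
    flat = {}
    for key, value in obj.items():
        flat.update(flatten(value, prefix + ":" + key))
    return flat

def compare(named_docs):
    """named_docs maps filename -> parsed JSON; yields lines like the sample output."""
    flat = {name: flatten(doc) for name, doc in named_docs.items()}
    paths = sorted({p for doc in flat.values() for p in doc})
    for path in paths:
        for name in sorted(flat):
            yield "%s:%s:%s" % (path, flat[name].get(path, "MISSING"), name)
```

For example, compare({"file01.json": {"color": "red"}, "file04.json": {}}) yields ":color:red:file01.json" followed by ":color:MISSING:file04.json".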

Porting python 2 to python 3 (2to3) without unit tests.

The existing guides for porting python 2 to python 3 generally say "...then you run your unit tests and fix any problems you find." This is awkward if you don't have unit tests. While you will not escape testing (perhaps by your angry users in production), you can save a little time by:

  1. Before the conversion, compile your code to python 2 , to establish it isn't already broken 
  2. After the conversion, see if you can compile your code to python 3 
  3. After the conversion, run pylint --py3k 

This snippet will compile your code as python 2. It is meant to be called from a larger python 3 script with something like subprocess.check_output. It will not catch runtime errors like "undefined variable", but it will catch compile-time errors like "bad indent".


#!/usr/bin/env python
import py_compile, sys
print("compiling for python 2 %s"%(sys.argv[1]))
py_compile.compile(sys.argv[1], doraise=True)

This subroutine will compile your code as python 3:


import py_compile
import sys

def compile_python3_ok(file):
    try:
        py_compile.compile(file, doraise=True)
        return True
    except py_compile.PyCompileError as the_exception:
        msg = "\nERROR FAIL python3 compile %s\n" % (file)
        sys.stderr.write(msg)
        sys.stderr.write("\n" + str(the_exception))
        return False

You can install pylint with


sudo apt install python3-pip
pip3 install pylint
export PATH=/home/$USER/.local/bin:$PATH

This snippet will run pylint with the python 3 checkers


#!/usr/bin/python3
import subprocess
import sys

def run_(cmd):
    try:
        log_me = subprocess.check_output(cmd.split(), stderr=subprocess.STDOUT)
        print(log_me.decode())
    except subprocess.CalledProcessError as the_exception:
        msg = "ERROR FAIL %s\n" % (cmd)
        sys.stderr.write(msg)
        sys.stderr.write(the_exception.output.decode())
  
CHECKS_TO_SKIP='check99,check22'
file='/path/to/file'
run_("pylint --py3k --disable=%s %s"%(CHECKS_TO_SKIP,file))

Pylint is kinda like having unit tests, in that you should understand the problems it calls out and either fix them or skip the checks; and kinda not, in that unlike 100% test coverage, it will miss some problems.

ACKNOWLEDGMENTS

FUTURE

For simplicity, the code examples here are stripped of useful but distracting bits like logging and conversions that 2to3 doesn't handle. Eventually, I'll strip out my employer's proprietary things and put the full script on GitHub. Sooner rather than later, if somebody shows an interest in the fuller code.

Friday, June 21, 2019

SOLVED restore chef nodes, clients, vaults from knife-ec-backup


RESTORE A CHEF VAULT ITEM

A vault item is made of two data bags.

   Data bag 1 is: VAULTNAME
   Data bag 2 is: VAULTNAME_KEYS

Restore the data bags and you restore the vault item.

Restore just one data bag and you break the entire vault: nobody else can use it until you restore the other data bag or delete your first data bag. The general process is:
  1. Make a copy of the chef backup 
  2.  cd to the directory corresponding to the organization and vault you want to restore
  3.  run the knife commands to restore the data bags
cd ~/backup_dir/organizations/your-org-dev/data_bags/certs/
knife data bag from file certs root_cert_wrtc_your-orgdev_com.json --config-file ~/.chef/knife.rb
knife data bag from file certs root_cert_wrtc_your-orgdev_com_keys.json --config-file ~/.chef/knife.rb

RESTORE A CHEF CLIENT


The general process is:
  1. Install the 'jq' JSON manipulation tool
  2. Make a copy of the chef backup 
  3. cd to the directory corresponding to the organization you want to restore a client to
  4. Extract the public key from the backup file
  5. Run the knife client command to create the client

sudo apt install jq
cd ~/backup_dir/organizations/your-org-dev/clients
jq .public_key client-one.example.com.json > client-one.example.com.public_key
sed -i 's/\\n/\n/g' client-one.example.com.public_key
knife client create client-one.example.com --public-key client-one.example.com.public_key

NOTE: There is no obvious way to run 'knife client create' without it popping up an editor

RESTORE A CHEF NODE


The general process is:
  1. Make a copy of the chef backup 
  2. cd to the directory corresponding to the organization you want to restore a node to
  3. Run the knife node command to re-create the node
  cd ~/backup_dir/organizations/your-org-dev/nodes
  knife node from file EDV_omacneil_1558372161_AWS.fmr.com

Sunday, January 31, 2016

Follow technology dream, slightly un-break the world, accept some regret

Over 15 years, I made a few sacrifices to follow my tech dream of making the world a slightly better place. After 4 years of reflection, I have some satisfaction and some selfish regrets. Clickbait but true: the selfish regrets probably aren't what you'd think.

DISCLAIMER: There are at least 20 people I don't acknowledge here. Without you accepting the legal responsibility of board membership, making generous, sometimes sacrificial contributions of resources and/or providing the political and fund-raising advice I needed to keep our nose above water, this work wouldn't have happened. You deserve acknowledgement. However, I took 3 years to write what is here. Any more delay and this note will never be finished. I hope you know who you are and know that I am grateful for your help.

Confirmation bias is a thing. If you repeat, over and over: "At least I tried", you can tell yourself: "yes, it was worth it."

If you believe:

 "Do not store up for yourselves treasures on earth, where moth
  and rust destroy, and where thieves break in and steal. But 
  store up for yourselves treasures in heaven, where neither 
  moth nor rust destroys, and where thieves do not break in 
  or steal; for where your  treasure is, there your heart 
  will be also.

...Earning about $7,000 per year for 15 years of tech support, system administration, cat herding, tutoring and software engineering is just fine.

So that's me with a spouse that paid the rent so I could be holier-than-thou for 15 years. Compared to the quiet guy who has run a good after school program in the same notorious low income neighbourhood for 40 years, I have few, legitimate self-righteous points.

Not maximizing my income was a means to the end of my dream of using technology to make a better world at the Community Software Lab (RIP). However, being (somewhat) low income had interesting side benefits. For example, these last 4 years at market rate, our income more than doubled. However, it ***feels*** like we've been living only slightly large. Dinner and drinks a few times a week vs just coffee at the local shop. Running outside in the winter vs Yoga 5 days a week. Perhaps the feeling of living large these past 4 years is muted by paying debts and starting to save toward old age.

Again, confirmation bias, but past an income floor of < some number of $ >, limiting desires is probably a better strategy for material happiness than increasing income. We still have credit card debt, but instead of paying to attend funerals on the other side of the country, we're paying off a winter beach vacation.

In my limited experience, subjective perception is a better predictor of satisfaction than empirical measurement. Surrounded by middle income family, acquaintances and in-laws, the expectations of our peers were always a greater pressure than the cost of adequate housing, food or medical care.

When I was a practising Catholic, I did informal tech support for the Oblates of Mary Immaculate. If lunch-time came in the middle of work, we had lunch together. I have a sense for how these guys live. Everyone has health care, an allowance for new uniforms, meals, a computer, a bedroom and a bathroom to themselves, and a spot in the retirement home when the time comes.

The rooms were a little smaller than your average Motel 6, but a little less shabby. I think they got $20 per week in spending money. If they needed a car for work they submitted a budget. Oblates working overseas live even simpler, closer to the people they serve.

The Oblates have infrastructure and a supporting culture. It didn't seem like honouring the Oblate vow of poverty was a significant struggle. My spouse and I struggled in our unstructured time of relatively simple living. I think our struggle was more about being embedded in the culture of money than it was being able to pay our bills.

Considering only the money issues, for at least me, autonomy and the chance to make a difference, were worth not maximizing our income. However, for different selfish reasons, given a do over, I would do things differently.

The last 8 of 15 years, my focus was using novice programmers to build useful software for low income people in an organization I started called the Community Software Lab. Three lab alumni work at Google, Amazon and Microsoft. More significantly, three different people explicitly credit the Community Software Lab for significantly better opportunities and lives than they would have had otherwise.

Despite occasional flashes of student brilliance and the many personal indulgences granted me by teachers and educational bureaucrats, I've long considered the industrial/educational complex ineffective and unjust. The lecture system was created in medieval times because it is quicker to read a book aloud to 50 people than it is to hand-write 50 copies. Adjunct professors do almost all the revenue-producing teaching, but about 25% of them (those without independent means) are eligible for public assistance.

Writing software that people use, following a modern process, is a better way to learn to program than listening to lectures or copying your roommate's homework. Requiring 100% test coverage for new code, creating detailed engineering documentation, tools, standards and processes and (most important) re-re-doing line by line code review, allowed people starting with only the ability to write a FOREACH loop to contribute production code and gain real skill. By unanimous testimonial, working with me was hugely more educational than undergraduate computer science education.

This is all anecdotal, but I challenge any CS dept to provide better outcomes with their standard program than I could locked in a room with five CS 101 grads and their tuition for 4 years.

The problem is that most people don't have the drive to write software. They want a paycheck or the entitlements accorded to the coder caste. They don't want to stay up until 3:00am, failing again and again, debugging for joy or compulsion. I helped hundreds of people confirm that they really didn't want to be engineers. The educational industrial complex has the advantage of not investing much emotional energy helping people accept their failure.

Over 8 years, of the approximately 200 people I set up development environments for, 10 created more value (useful code) than I could have created by spending the setup / tutoring / code review time just writing code myself. More people personally benefited from the experience than contributed more than they took, but it wasn't the greatest good for the greatest number.

Given only one chance to repeat the last 8 years of volunteer experience (the previous 7 are another matter), I'd ignore everyone else and lock myself in a room to code. Crappy as it was, our software did refer tens of thousands of people to the social services they needed. Not-crappy software would have been a far greater good for a far greater number. Certainly more than the (at best) 25 people who got to be better coders from their work with the Lab.

Compounding the uncertainty is my suspicion that most of our "successful" alumni would have made it anyway. The MIT dropout, yes. The guy with 10 years industry experience waiting out a recession, yes. The driven, angry, smart kid who'd lived half his life in foster care, probably.

Despite our best efforts, most of the people we gave practical experience to weren't especially disadvantaged. Our usual admissions test was "Use Javascript to prompt the user for a positive integer, then print the numbers between 1 and that integer". In theory this bar is low enough for anyone to pass; in practice only 2 of the 10 productive people came from a low income background or arrived with a label like "at risk youth". We probably accepted a good number of people without advantages, but subjectively, the big service we provided most of them was helping them accept that they didn't like coding.

Selfishly: given those 8 years over again, I wouldn't be the mediocre programmer I am today. If instead of explaining the same things over and over, I'd written and re-written code, I'd be pretty good, or at least better. It felt horrible to blow the recruiter-level phone screen at Facebook. It felt worse to be completely oblivious to the SQL INNER JOIN syntax in a web dev interview for a spot on a team full of people as smart or smarter than me. (FWIW, I am still reasonably employable at a few levels below elite.)

In both interviews, I missed things because I'd been focused on explaining existing code and making it hard for people to screw up new code, instead of writing new code or updating code myself. Building skill is different from helping other people learn. Also, usually being the most knowledgeable person in the room is not a recipe for personal growth.

For example, in 2006 it was an acceptable practice to join database tables with conditions in the WHERE clause. By my interview in 2013, modern databases supported explicit JOINs and modern programmers used them.

This strong regret was obvious only in hindsight, after I'd been away from the lab for a few years. In the moment, it was mostly great. I like Yoga a lot partly because the instructor usually feels compelled to thank the class for the privilege of sharing the skill. It feels very, very good to contribute to other people's ah-ha moments.

People talk about engineering education a lot like they talk about baseball. At the little league level there is lots of fun, but eventually the grind sets in. Maybe the grind is the price of greatness. Maybe the grind is yet another black mark for the educational/industrial complex. Having 3 people get jobs at famous companies was OK. More satisfying than alumni success was watching a woman, a serious A+ student, pound her heels, shake her fists and giggle after fixing a complicated bug.

Overall, these moments of joy didn't add up to what might have been with a more selfish approach.

--- or maybe a more effective approach ?

It's kinda obvious, but the reason only about 10 people gave more than they took and (guessing) only about 25 people benefited from the practical experience was that not many people put in the needed time.

Without exception, people who dabbled, trying to fit our work in between school or other jobs or being a mom, were a net loss. Some of those people were quite talented but just couldn't find the time.

The people that succeeded put in at least 20 hours a week, and most of those people got paid a small stipend -- often below minimum wage, but something.

Most of the people who succeeded were Americorps/VISTAs. The government paid them $16K per year and basic health benefits. The program was created by President Kennedy to fight poverty through (almost) voluntarism and required a full time commitment. Other people got funded through Google Summer of Code. We got some contracts. I was able to fund-raise to pay other people. I did other completely unrelated contract programming work that paid relatively well.

This is an insight for other groups trying to skill-up low income or disadvantaged people. "Work for free" is not perceived as a survival skill by low income people or entitled-feeling middle class people. There is an almost infinite distance between "almost nothing" and "free" for both producers and consumers. Many people needed the stipend to pay their rent. Other people just felt exploited working for free -- even if the "work" was mostly training that they'd otherwise have to pay for.

The ideal would have been to have a summer camp with computers, Internet access, showers, a silo of rice and a silo of beans. Given that I spent about half my time trying to fund-raise as a charity and barely kept the doors open with an annual budget of around $20K, this was probably not a reachable ideal.

Given more than one chance for a do-over, I might just have worked as a market rate programmer, banked the money for 8 years and retired early to spend the banked money on stipends (assuming my wants didn't expand to match my earnings).

The biggest sacrifice wasn't money, but personal growth. The biggest barrier to most people's personal growth was a little money. There is a difference between something and nothing. So there it is. Money isn't important, except when it is.

I've covered the major regret (lack of personal growth). A minor regret is lack of measurement. If I'd taken the time to better define success in advance, I might have changed course earlier and I might have fewer doubts now.

Disclaimer: The first version of our (MVHub) software was written by DS and EA in about 8 months working by themselves. During their work I was mostly focused on system administration. ---providing email, web hosting and databases to non-profit organizations.

Sunday, September 7, 2014

POP3 gets academic / university / office 365 into gmail

I use Gmail.[*] Part of the cost of a night course at the local Uni is agreeing to use their email, which uses Microsoft's Outlook.com. Fortunately, Gmail can be configured to pull email from other systems.

The winning search term is POP3 academic office 365. If you know what POP3 is, all you need is the POP3 server info:


Incoming POP3 server: outlook.office365.com, port 995, encryption SSL

If you don't know what POP3 is, proceed as follows. You will need:

  1. Your student email username
  2. Your student email password
  3. Confidence that your university is using Academic Office 365  or Outlook.com
  4. a gmail account. 
Given the above:

Login to Gmail and access settings:

Choose accounts


Add POP3 account you own

Add your email address

Set username (probably your email address), password, POP3 server (outlook.office365.com), port (995) and encryption (SSL)


Finish (answer "no" to sending outgoing mail as your student account)


Possible Problems / sources of errors

  1. Don't include any invalid letters like a comma  in the POP3 server name 
  2. Do get your username correct. (probably your email address)
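If the account still won't connect, Python's poplib can sanity-check the server and your credentials outside of Gmail. A minimal sketch; the address and password below are placeholders, and pop3_message_count is a hypothetical helper:

```python
import poplib

def pop3_message_count(server, port, user, password):
    """Log in over SSL and return the number of messages in the mailbox."""
    conn = poplib.POP3_SSL(server, port)
    try:
        conn.user(user)
        conn.pass_(password)
        count, _size = conn.stat()
        return count
    finally:
        conn.quit()

# pop3_message_count("outlook.office365.com", 995,
#                    "you@your.university.edu", "your-password")
```

If this raises poplib.error_proto, the username or password is wrong; if the connection itself fails, the server name or port is.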

Bigger issues

Why do many students have to deal with this multiple-account hassle, instead of one person at the university providing good service by compiling a directory of students' existing email addresses?

[*] albeit with Icedove

Sunday, August 3, 2014

HOWTO backup mediawiki for easy restore on debian wheezy/squeeze

Creating a useful backup for a web app like MediaWiki is a somewhat more complicated process than backing up somebody's home directory.

While there is no warranty, I have tested this script on the Debian GNU/Linux Wheezy release. Tonight I used it for real to move a MediaWiki on a dead Wheezy server to a living Squeeze server.

Assuming you have a working, somewhat standard MediaWiki setup on Debian Wheezy (and probably squeeze), the script below creates a complete tarball backup of your Debian Wheezy MediaWiki install, including:
  1. Database 
  2. MediaWiki configuration
  3. PHP configuration
  4. Apache configuration
  5. README.restore , containing complete instructions on restoring your MediaWiki
By default, the script deletes backups older than 90 days. Edit the script and change the variable delete_after_days if you don't like this. 

  To install / test install:
  1. copy the script below to /etc/cron.daily/backup_media
  2. edit the script to change the config variables (probably just mysqlopt to set mysql root password)
  3. make the script executable
  4. make the script readable only by root
  5. run the script.
  6. untar the tarball in /var/backups/mediawiki/
  7. examine the tarball contents, in particular README.restore

#!/bin/sh

# based on http://www.mediawiki.org/wiki/User:Megam0rf/WikiBackup

########### start configuration variables#########

# You need to change this:
mysqlopt="--user=root --password=your-mysql-db-root-password"     #  user/password to run mysql dump as

# you might want to change this:
delete_after_days=90                                              # how long to keep the backups

# you probably don't need to change these:
now=`date +%Y-%m-%d-%H_%M-%a`                                     # used to give you timestamped backup files, 
backupdir=/var/backups/mediawiki                                  # the directory to write the backup tar file to
dump_dir="$backupdir/mediawiki_backup.d/"                         # directory to hold backed up files
db_dumpfile="$dump_dir/mediawiki-restore.sql"                     # database dump to this file
tarfile="$backupdir/mediawiki-backup-$now.tgz";

########### no more configuration variables below #########

mkdir -p $dump_dir || exit $?

# dump the database
mysqldump  $mysqlopt wikidb  | gzip > "$db_dumpfile.gz" || exit $?

cp -ar /etc/mediawiki $dump_dir/etc-mediawiki                       || exit $? 
cp -ar /var/www/mediawiki_images $dump_dir/var-www-mediawiki_images || exit $?
cp -ar /etc/cron.daily/backup_mediawiki $dump_dir/                  || exit $?


cat << 'EOF' > "$dump_dir/mediawiki-setup.sql"

 -- #################         start HEREDOC for mediawiki-setup.sql #################

START TRANSACTION;
  CREATE DATABASE wikidb;
  USE wikidb;
  CREATE USER 'wiki'@'localhost';
  GRANT DELETE ON wikidb.* TO wiki;
  GRANT SELECT ON wikidb.* TO wiki;
  GRANT UPDATE ON wikidb.* TO wiki;
  GRANT INSERT ON wikidb.* TO wiki;


  -- CHANGE mysql 'password'  (below) to password you configured 
  --  Mediawiki to connect to database with see: 
  --      grep wgDBpassword  etc-mediawiki/LocalSettings.php

  SET PASSWORD FOR  wiki  = password('password');
COMMIT;

  -- #################         end HEREDOC for mediawiki-setup.sql  ################# 

EOF
 
cat << 'EOF' > "$dump_dir/README.restore"

#################         start HEREDOC for README.restore  #################

# To restore on Debian wheezy  run commands below

# Assumptions are that you:
 #   use mysql / apache 
 #   will handle Dns on your own.
 #   know what the root mysql password  is on your system

sudo apt-get install php5-mysql mysql-server mediawiki mediawiki-extensions-base  mediawiki-math
sudo apt-get install mediawiki-extensions-confirmedit mediawiki-extensions-openid 

 # follow comment about changing password
 # re-discover password configured mediawiki to connect to mysql with
 
       grep wgDBpassword  etc-mediawiki/LocalSettings.php

 # change script to use that password for the mysql user 'wiki'
 sudo $EDITOR mediawiki-setup.sql

 # unzip the database backup
 gunzip mediawiki-restore.sql.gz

# create the database and database user
 mysql -u root --password=your-mysql-root-password < mediawiki-setup.sql

 # restore the database
 mysql -u root --password=your-mysql-root-password  wikidb < mediawiki-restore.sql

 # copy the files
 sudo mv /etc/mediawiki            /etc/mediawiki.original
 sudo mv etc-mediawiki             /etc/mediawiki
 sudo mv var-www-mediawiki_images  /var/www/mediawiki_images
 sudo mv backup_mediawiki          /etc/cron.daily/

 # replace the default apache conf file  for mediawiki with yours
 sudo rm /etc/apache2/conf.d/mediawiki.conf
 sudo ln -s /etc/mediawiki/apache.conf /etc/apache2/sites-enabled/wiki.conf


 # set the mysql root password  and other parameters you care to change
 sudo $EDITOR /etc/cron.daily/backup_mediawiki

 #################         Stop HEREDOC for README.restore  #################

EOF

cd $backupdir
tar -zcf $tarfile mediawiki_backup.d/ || exit $?
rm  -rf mediawiki_backup.d/ || exit $?

/usr/bin/find $backupdir   -type f -mtime +$delete_after_days -exec rm {} \;

############   END  of  backup script ###############