Friday, October 25, 2019

ATO 2019 - Inclusion event (a report)

This was the second year that ATO hosted a pre-conference track on diversity and inclusion. It was a sold-out event with a free but separate registration (for booking, budgets, and accounting). I attended last year as well.

As I began writing up this report, I noticed that the title of the event does not include the word diversity. According to the Wayback Machine, the main title was the same last year, but it felt like the word diversity was included in most of the promotion of the event. Last year's event did have "A Conversation" as part of the title and incorporated much discussion on the definitions of, and differences among, diversity, inclusion, and equity. This year the title was simply Inclusion in Open Source & Technology [1], and the presentations had a lot more actionable examples of how a project, organization, team, or individual can be more inclusive.

I really like the format of this event. It features a series of short talks, which this year were mostly people's stories of how they came to feel included, or of actions they think there should be more of so that others feel more included. Afterward there is a Q&A session for everyone to further explore these topics and suggestions.

This year also included a screening of the second episode of the Chasing Grace Project and a Q&A with the producer. I cannot seem to remember which event I was at when I had the opportunity to screen the first episode. I am looking forward to the complete series being available to a wider audience.

Last year I remember feeling a mix of depression and optimism. There were a lot of examples showing how those paying attention have expanded the types of diversity considered beyond gender and race, and how many opportunities do exist. There were also a lot of stats showing how slowly progress is happening and where it is even going backwards. In many ways I felt like I was hearing the same things I've heard all my life, and that is a tiring thought.

This year was, at least for me, a lot more positive. I think that is mostly because the discussions were not so much about statistics and abstract work that still needs to be done, but rather full of examples of activities that have helped and could help:

  • The young high school student who asked for more everyday role models, like parents and teachers sponsoring club activities. Representation at the C-level is important, but not as important as having someone in the room learning technology alongside the students.
  • The older, but not ready to retire, gentleman reminding people that, having had to change technologies so many times, older people bring a lot of experience and can still learn new things - sometimes even learning faster. Most of us also accept (even enjoy) being managed by more youthful enthusiasm as long as we are not simply dismissed as dinosaurs.
  • The consultants who help D&I committees proactively create company communities and both networking and educational opportunities.
  • The examples of how to reach out of your comfort bubble, grow your own network, and be an ally.
I came away reminded that I am where I am, and still an Open Source consultant and educator, because of the welcoming and supportive people I have gotten to work with. People who treat other people as people. People who can work as part of a team. People who want to do the right thing and give the right people the credit they deserve. These people were rarely official mentors, and many have never thought of themselves as allies, but by being good humans, they were allies to me.

The little things matter. They matter when they produce the thousand paper cuts that drive people away. They matter when they appear from an ally and encourage inclusion.

-SML


[1] Note: at the time of writing, the URL for this event was for the current year. At some time in the future it may be replaced with the next year's details. I do not know if it will be archived. I was able to submit the page to the Wayback Machine.

Thursday, October 24, 2019

ATO 2019 - an event report

ATO 2019 was a good year.

For a number of years now, each October, thousands of technical folks have converged on Raleigh for All Things Open. The "all things" includes a lot of developers talking about open source platforms, tools, stacks, and applications, but it also includes topics on open hardware, open government, open education, and building communities in addition to projects and products.

For a couple of years, I felt there was too much of a programmer focus for me, and I wasn't finding new things in the community tracks. It is local, though, so with expectations set, I continued to support a great conference and enjoy the hallway track with a number of people I "see" mostly online, even though I was not previously finding a lot of talks for my sysadmin or infosec interests.

I know several local people who have not attended the past couple of years because of this trend, and I bring it up because this year was a bit different. While I attended expecting to once again find content that was either repetitive (of other years and other conferences) or too dev focused, I was pleasantly surprised. There were full tracks both days for Security and Linux/Infrastructure. [1]

I attended a few of the security sessions; two that stood out were:

Prepping for the Zero Day Attack 
Eric Starr discussed a CI/CD pipeline that includes checking for vulnerabilities with both source code analysis and container scanning. He shared experiences where unit tests were disabled "to speed up the deployments," which later turned into disasters. He was practical in his approach to scans that take hours to run: if the deployment or test cycle is shorter than a day, maybe those scans get run daily instead of with each change, but do NOT eliminate them just because they take too long! He mentioned tools that work for his project but regularly pointed out what type of tool each was and that the specific tool used is not important. I would add that the best or right tool is any one you will actually use, though you may be limited by what will work in your environment.
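To make that scheduling idea concrete, here is a minimal sketch (my own, not from the talk) of a gate that runs fast checks on every change but throttles long-running scans to at most once per day instead of dropping them. The file name, scan names, and interval are illustrative assumptions.

```python
# Sketch: run cheap checks on every change, heavy scans at most once per day.
# The state file, scan names, and interval below are illustrative assumptions.
import json
import time
from pathlib import Path

STATE_FILE = Path(".scan_state.json")   # hypothetical bookkeeping file
LONG_SCAN_INTERVAL = 24 * 60 * 60       # heavy scans at most daily (seconds)


def seconds_since_last(scan_name: str) -> float:
    """How long ago the named scan last completed (infinite if never)."""
    if STATE_FILE.exists():
        state = json.loads(STATE_FILE.read_text())
        if scan_name in state:
            return time.time() - state[scan_name]
    return float("inf")


def record_run(scan_name: str) -> None:
    """Persist the completion time of a scan."""
    state = json.loads(STATE_FILE.read_text()) if STATE_FILE.exists() else {}
    state[scan_name] = time.time()
    STATE_FILE.write_text(json.dumps(state))


def should_run(scan_name: str, is_long_running: bool) -> bool:
    """Fast checks run every time; long scans only when their window has passed."""
    if not is_long_running:
        return True
    return seconds_since_last(scan_name) >= LONG_SCAN_INTERVAL


if __name__ == "__main__":
    for name, long_running in [("unit-tests", False),
                               ("static-analysis", True),
                               ("container-scan", True)]:
        if should_run(name, long_running):
            print(f"running {name}")    # a real pipeline would invoke the tool here
            record_run(name)
        else:
            print(f"skipping {name} until its daily window")
```

The point is not the bookkeeping; it is that the expensive scans stay in the pipeline on a schedule rather than being deleted when they slow down the per-commit loop.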

Insecurities and Vulnerabilities: How to Keep the National Vulnerability Database Current
I really enjoyed this one! Rob Tompkins shared his experience reporting CVEs as part of an open source project's security team. When I teach about tools such as OpenSCAP and Red Hat Insights, which include information from the NVD and then suggest remediations, it is helpful to understand how the information gets into that database. This example, along with a talk from OSCON years ago about reporting embargoed security issues, also helps me explain how an administrator should go about reporting a suspected vulnerability with the correct documentation. This is a topic I am now adding to my "write an article on this" list.

Next door, in the Linux/Infrastructure room, judging by title I would be interested in Getting Started with Flatpak and possibly Platform Agnostic and Self Organizing Software Packages, as well as What You Most Likely Did Not Know About Sudo… and maybe the Terminal Velocity: Work faster in your shell talk.

With these tracks, I would encourage a few of my more "Ops" friends to rethink attending this conference, especially if they are local to the area. I also have some new ideas for articles to write and possible presentations at future events.

Oh, they also have great book signings scattered across both days!

-SML

[1] Note: at the time of writing, the URLs for the tracks were for the current year. At some time in the future these will be replaced with the next year's tracks. I do not know if they will be archived. I was able to submit the parent tracks page to the Wayback Machine.

Wednesday, October 23, 2019

Writing Summary - late summer 2019

I've done some (ok, very little) writing for opensource.com in the past, and I still have some notes for more articles that keep getting pushed aside. The site is almost 10 years old, community driven (with Red Hat sponsorship), and tries to cover a variety of open topics, products, projects, and distributions.

This summer, some of the staff from that project switched over to help Red Hat start a new blog for system administrators called Enable Sysadmin. As the name implies, it is focused on system administration topics, and as a corporate blog it can also be a bit more Red Hat product specific. In addition to a small staff, a few part-time contractors, and a number of Red Hat employee contributors, they do accept and encourage community contributions.

I have enjoyed being one of the early authors. Of course, like all my writing projects, I have plenty more ideas in my head and not enough focus to get them organized in a timely manner.

So far I have written two articles about using SSH keypairs, two articles about SELinux, and a short article about cybersecurity awareness month.

How to manage multiple SSH key pairs

Passwordless SSH using public-private key pairs

Accessing SELinux policy documentation

Four semanage commands to keep SELinux in enforcing mode

Security advice for sysadmins: Own IT, Secure IT, Protect IT

-SML

Friday, June 28, 2019

Red Hat Summit 2019: My notes

My notes from sessions at Red Hat Summit 2019 are for my reference and as documentation for any submitted continuing education credits.

The Ansible party was awesome as usual (even if it was part of the Smart Management party). Great food at Legal Harborside with lots of people I wanted to see.

I'm glad I watched most keynotes remotely. The one I did attend in person reminded me of how cold that space is and how many people wear chemical scents that trigger my asthma.

Ran into more cool people at the Red Hat Women’s Leadership Community Luncheon.

The remaining notes list the sessions I attended, as a reminder of which slides or videos to reference for more details, as well as topics, commands, and keywords to dig into in the future.

Keynote recordings are available in the YouTube channel.

Session descriptions have links (where available) to slide decks.

On Demand session recordings require a login.

5/7: Red Hat security roadmap: It's a lifestyle, not a product

  • Speaker: Mark Thacker, Red Hat
  • Slides available
  • Recording available

5/7: The current and future state of security: A discussion of security challenges (Birds of a feather)

5/7: Successfully implementing DevSecOps: Lessons learned

  • Speakers: William Henry, Red Hat; Deven Phillips, Red Hat, Inc.; Lucy Kerner, Red Hat
  • UBI - Universal Base Image
  • https://github.com/rht-labs/labs-ci-cd
  • Case Study: Homeland Security in Innovative Labs
  • Look at pipeline box on Heritage slide.

5/7: Security: Emerging technologies and open source

  • Speaker: Mike Bursell, Red Hat; Nathaniel McCallum
  • Slides available
  • Recording available

5/8: Top 10 security changes in Red Hat Enterprise Linux 8

  • Speaker: Mark Thacker, Red Hat
  • Slides available
  • Recording available


5/8: Security and compliance automation: Demos of current capabilities and future technologies

  • Speakers: Shawn Wells; Chris Reynolds, Red Hat; Gabriel Alford, Red Hat Inc
  • Included pipelines with Ansible Tower, SCAP, and Open Controls.

5/9: Red Hat on Red Hat: Transitioning Red Hat IT to hybrid cloud infrastructure using OpenStack and Ceph Storage

  • Speakers: Brian Atkisson, Red Hat, Inc.; Matthew Carpenter, Red Hat, Inc.
  • Slides available

5/9: Evolution of a Linux system identity and authentication stack

  • Speaker: Dmitri Pal, Red Hat, Inc.
  • Slides available

5/9: A practical introduction to container security using CRI-O (LAB)

-SML

Tuesday, March 5, 2019

20 Years Ago: Remembering my first RHCE exam

When, where, and why

I learned about network operating systems while working in a personal computer (PC) help center. After several years talking to customers and being an escalation point for other support engineers, I moved into the training department.

I was responsible for training the technicians, using both in-house written materials for our own hardware products and partner materials for most of the software. When I started working with Linux, I was already teaching official material for OS/2 Warp Server and SCO Unix, in addition to some material for Microsoft, Banyan Systems, and Novell products.

When IBM invested in Linux, my position in a training department and my Unix background made me a lead on the team facilitating a plan to get the PC help centers around the world up to speed in supporting four Linux distributions. We needed to quickly ramp up on Red Hat, Caldera, SUSE, and TurboLinux. I became the point person for Train-the-Trainer sessions on all four distributions, in addition to having the responsibility of getting the North America support center trained.

The Red Hat office, with its brand new Training and Certification team, was located halfway between my office and my home, so of course I started by attending their Red Hat Certified Engineer course. That was 20 years ago. To be precise, it was March 1-5, 1999.



The course

The course was similar to other technical training courses which I had attended and taught. The RH300 RHCE Course was new, and it was the only course offered at the time. In those years, the authors and the instructors were the same people.

It was a lot of information for a single week and it required a good foundation of prerequisite knowledge. Some things have not changed in 20 years!


The exam

I could not find the course or exam descriptions from 1999, but the Internet Archive's Wayback Machine does show the spring 2000 Prep Guide: https://web.archive.org/web/20000407183013/http://www.redhat.com/services/training/training_examprep.html

My memory is that most items did not change much in that first year.

The format:

Like now, the exam was on the Friday. It was mostly hands-on, but not entirely, since the first iteration included a multiple-choice section. It had three parts:


  • Installation Lab Exam - 2.5 hours
  • Written Exam - 1 hour
  • Debug Lab Exam - 2.5 hours
  • Passing required an average of 80 or higher, with no single section score lower than 50 points.


I think my class did the installation in the morning with debug in the afternoon. The written section was in between, followed by a lunch break. Soon after, the lab sections were swapped in the daily schedule. I suspect it was quicker to grade the debug lab than the installation lab, and systems needed to be reset for each section. My class did not need a reset since we took the exam on Zip drives. Yup, you heard that correctly, ZIP drives. They were removed, labeled, and graded later. The email with my score report is dated March 19th. Two weeks later!


The objectives:

Objectives that are still seen today included:

  • Red Hat installation and network configuration
  • filesystem layouts and user management
  • boot issues and boot loader options
  • package management and automated installation
  • various network service configuration and security

Of course, at that time it was network-scripts files, ext2, NIS, LILO, rpm, squid, tcp_wrappers, and ipchains instead of nmcli, xfs, GRUB, yum, systemd, and firewalld.

There were also some "get off my lawn" objectives:

  • We had to understand XFree86 configuration and get graphical login managers working
  • We used boot floppies for rescue. Yes, Floppies!
  • And we had to "be able to configure, build, and install the Linux kernel and modules from source".

The version:


  • My exam in March 1999 was on Red Hat Linux 5.2.
  • GNOME was a technology preview and the kernel was 2.0.36.
  • A month later Red Hat Linux 6.0 was released.
  • The 2.2 kernel and glibc 2.1 were major new features and GNOME was the default GUI.


What Came Next

I became a Red Hat Certified Instructor through a partner program and immediately started sharing the wonders of Open Source with a whole new set of users. I added Red Hat Certified Examiner credentials a year later.

The first class I taught using Red Hat Training materials was written for RHL 6.0 and delivered in Scotland. That, though, is a story for another article.

Friday, March 1, 2019

FeBRRRRuary reading and writing

Stuff I wrote:

Getting started with Vim visual mode at opensource.com turned out to be very popular.

Fedora IoT Docs are Live, for the Fedora Community Blog, is a summary report of the majority of my February writing.

A technical reboot, recharge, upgrade, and expansion of the Fedora IoT Documentation took up most of my time and provided an opportunity to spend the dreary month working from home instead of commuting.

New bookmarks:

Specifically for the Fedora IoT project, I did a lot of reading during the month as well. Here are a few of the items I bookmarked. Some inspired my hacking and writing, and some are saved for future adventures:

Getting to Know Fedora Silverblue

Raspberry Pi improvements in Fedora 29

Fedora IoT with Peter Robinson | OpenHours ep 133 video from 96Boards Open Hours.

How to turn on an LED with Fedora IoT

Turn a Raspberry Pi 3B+ into a PriTunl VPN

Set the holiday mood with your Raspberry Pi

How to build fully automated musical lights [Halloween/Christmas]

Home Assistant Installation on Docker.

Mozilla IoT Gateway


Sunday, January 14, 2018

Prime - not impressed

I'm a Whole Foods junkie. Or was, depending on whether you still consider it Whole Foods or already call it Amazon Foods. It started before Whole Foods, really, with family and food allergies, but as my allergies got worse it became a lifesaver when traveling. I knew I had a place where the staff was trained to answer questions about the ingredients and most things are very well labeled. People have said to me, "Oh, we have a Wegmans" or "we have a Trader Joe's," and these chains are useful for frozen dinners or raw ingredients, and they do have prepared food. I have checked out other local recommendations too as I have traveled, but so far no one has done as well as Whole Foods in having the labeling and variety for picky (by choice or allergy) eaters like me.

Sure, they have specialty items, many of which are expensive and only for the privileged. And prepared food is more expensive anywhere, but even the fancy organic Whole Foods hot bar is less than most room service when traveling. Also, when you eat all clean food, all the time, you actually can get a lot more nutrients and that full feeling with a lot less quantity. My family has tracked the budget.

We use a lot of the 365 Brand items, but mostly we cook simple meals from scratch, so we buy a lot of fruit and veggies. We have a garden, we shop at the farmers market, and we supplement from the local co-op and from Whole Foods.

All this is background leading to my skepticism about the Amazon acquisition of Whole Foods last year. While hoping that the larger buyer could negotiate better prices, and worrying about what it will mean for the employees (who so far appeared to be getting a fair wage and to like working in the store) in the future, I also wondered (still wonder) what it will mean for me. So far I have not been impressed. The items with lower prices have also been lower quality, and the variety of products, especially in the category of more expensive but great-tasting, allergen-friendly ones, is slowly dwindling.

Once Whole Foods + Amazon was up and running, I looked into getting some items delivered straight to the house. Most of the items I want are in Prime Pantry only, so you have to have Prime, which I didn't have. Yet. Amazon is also beginning to offer some deals in the store if you have Prime, such as a substantial discount per pound on a fresh turkey at Thanksgiving.

So I was willing to check it out. I had three things to investigate with Prime - shipping in general, Prime Pantry, and streaming videos. So I set up the free trial.

I knew that starting at the end of the holiday shopping season would have its own issues, but I wanted to see how it worked in my area. I live in a rural area with the mailbox almost a mile from the house, so I use a PO box in town for most mailings. I hoped that the Prime 2-day shipping would use carriers that deliver straight to the house. Unfortunately, Amazon uses a lot of USPS, and if the package is small enough to fit in the box (and we have a package box out there too) it gets left far away from the house. I might as well continue with shipping to the PO box so items are secure and dry. I rarely need something right away, so waiting to order until I have enough for standard free shipping works for me.

So what about streaming videos? I found a few things not on Netflix that I might watch again, but mostly they don't have anything better, and I find myself back on Netflix most times. I'll take advantage of it if I have it, but I don't see any value in getting Prime solely for this feature.

Prime Pantry is interesting, and if I did not live close enough for a weekly drive to an actual Whole Foods store, I might consider it. The first time I looked, they didn't have much. The next time I looked, there were enough items for me to fill a box if I needed them. I also checked, and yes, the prices are exactly the same online as in the store. Then last week I decided to actually give it one try before my trial ends, but 9 of the 10 items I would have ordered were out of stock with unknown availability. That rules that out. I was already leaning toward a no. It is $6 per box for delivery on top of the monthly (or yearly) fees for Prime. I can drive about 20 miles to either of the nearest Whole Foods stores for less and find more of the items on my list plus all my other groceries.

I do not live in an area with Prime same-day shipping - though I am only 25ish miles from a fulfillment center - so 2-hour or Prime Fresh deliveries are also completely out and not even on the list of possibilities anytime soon. If I lived in one of those areas, or had a larger family to order items for, or just shopped online more, it might be worth it. But so far, for where I live and what I buy, I am not impressed.

The cancellation requires confirming what seems like a gazillion times that you want to cancel, as they remind you of all the benefits of staying, but I finally got through it and have ended my Prime trial.

-SML