About a week ago I finished reading The Clean Coder: A Code of Conduct for Professional Programmers by Robert C. Martin. Here are some of the quotes I relate to, or have related to in the past. The book is a summary of do's and don'ts with anecdotal references to get you in the mood; it's very light reading and can be finished quickly.
Here are some random ramblings about the importance of unit tests, which are rather obvious:
Why do most developers fear to make continuous changes to their code? They are afraid they’ll break it! Why are they afraid they’ll break it? Because they don’t have tests.
This quote affirms some discussions and agreements I have had with colleagues. You may or may not have the time, but training is a personal goal; hand-holding may be something your employer helps you out with, but it is not a requirement to stay relevant:
It is not your employer’s responsibility to train you, or to send you to conferences, or to buy you books. These things are your responsibility. Woe to the software developer who entrusts his career to his employer.
Perhaps you think that work should stay at work and that you shouldn’t bring it home. I agree! You should not be working for your employer during those 20 hours. Instead, you should be working on your career.
Professional programmers practice on their own time. It is not your
employer’s job to help you keep your skills sharp for you. It is not
your employer’s job to help you keep your resume tuned. Patients do not
pay doctors to practice sutures. Football fans do not (usually) pay to
see players run through tires. Concert-goers do not pay to hear
musicians play scales. And employers of programmers don’t have to pay
you for your practice time.
This is a quote I've been using often enough, and I was glad to see it mentioned in the book:
Remember Santayana’s curse: “Those who cannot remember the past are condemned to repeat it.”
This is rather important: do your job well by giving your managers the necessary detail to make an informed decision, and avoid delving into the details that only matter to you.
Providing too much detail can be an invitation for micro-management.
This often happens in the early stages of a career, when you think you are a hero and with some extra effort anything is possible. Much better to stick with reality, and read a fantasy novel if you want to imagine the impossible :-)
Hope is the project killer. Hope destroys schedules and ruins reputations. Hope will get you into deep trouble.
This is good overtime avoidance advice I'll be taking for future tasks
If your boss cannot articulate to you what he’s going to do if the overtime effort fails, then you should not agree to work overtime.
I like this analogy on knowing your tool set so well that the actual objective of your job becomes more straightforward:
Consider a guitarist like Carlos Santana. The music in his head simply comes out his fingers. He does not focus on finger positions or picking technique. His mind is free to plan out higher-level melodies and harmonies while his body translates those plans into lower-level finger motions.
This game is something I look forward to trying; it seems learning can spike with these methods:
Simulated combat does not map well to programming; however, there is a game that is played at many coding dojos called randori. It is very much like two-man wasa in which the partners are solving a problem. However, it is played with many people and the rules have a twist. With the screen projected on the wall, one person writes a test and then sits down. The next person makes the test pass and then writes the next test. This can be done in sequence around the table, or people can simply line up as they feel so moved. In either case these exercises can be a lot of fun.
Regarding tools, this matches my affinity for open source and for things that make sense, where managers tend to prefer the crafted marketing bullets of a commercial tool. There can be exceptions, though, so don't blindly follow suit with this:
When it comes to source code control, the open source tools are
usually your best option. Why? Because they are written by developers,
for developers. The open source tools are what developers write for
themselves when they need something that works. There are quite a few
expensive, commercial, “enterprise” version control systems available. I
find that these are not sold to developers so much as they are sold to
managers, executives, and “tool groups.” Their list of features is
impressive and compelling. Unfortunately, they often don’t have the
features that developers actually need. The chief among those is speed.
The eternal battle and inconsistency between developers and requirements: how volatile everything is, and how drawing the line is a timed challenge.
One of the most common communication issues between programmers and business is the requirements. The business people state what they believe they need, and then the programmers build what they believe the business described. At least that’s how it’s supposed to work. In reality, the communication of requirements is extremely difficult, and the process is fraught with error.
Developers, too, can get caught in the precision trap. They know they must estimate the system and often think that this requires precision. It doesn’t. First, even with perfect information your estimates will have a huge variance. Second, the uncertainty principle makes hash out of early precision. The requirements will change making that precision moot. Professional developers understand that estimates can, and should, be made based on low precision requirements, and recognize that those estimates are estimates. To reinforce this, professional developers always include error bars with their estimates so that the business understands the uncertainty.
I once heard Tom DeMarco say, “An ambiguity in a requirements document represents an argument amongst the stakeholders.”
Following the principle of “late precision,” acceptance tests should be written as late as possible, typically a few days before the feature is implemented. In Agile projects, the tests are written after the features have been selected for the next Iteration or Sprint.
I am not a fan of automated GUI testing, much less when there's still lots to be done on automating the components side:
Testing through the GUI is always problematic unless you are testing just the GUI. The reason is that the GUI is likely to change, making the tests very fragile. When every GUI change breaks a thousand tests, you are either going to start throwing the tests away or you are going to stop changing the GUI. Neither of those are good options. So write your business rule tests to go through an API just below the GUI.
Keep the GUI tests to a minimum. They are fragile, because the GUI is volatile. The more GUI tests you have the less likely you are to keep them.
There was a job where this was way too far from reality, and probably not even a goal. I really like this comment:
The best role for the QA part of the team is to act as specifiers and characterizers. It should be QA’s role to work with business to create the automated acceptance tests that become the true specification and requirements document for the system.
Dealing with meetings, and their usefulness or uselessness, is always a pain:
There are two truths about meetings. Meetings are necessary. Meetings are huge time wasters.
I know of people who just go to meetings as their job, which is crazy given that they are software developers...
You do not have to attend every meeting to which you are invited. Indeed, it is unprofessional to go to too many meetings. You need to use your time wisely. So be very careful about which meetings you attend and which you politely refuse.
One of the most important duties of your manager is to keep you out of meetings. A good manager will be more than willing to defend your decision to decline attendance because that manager is just as concerned about your time as you are.
Most people probably need to work on the last part of this...
Iteration planning meetings are meant to select the backlog items that will be executed in the next iteration. Estimates should already be done for the candidate items. Assessment of business value should already be done. In really good organizations the acceptance/component tests will already be written, or at least sketched out.
On the need for a peer plus one: not everything can be agreed upon, so sometimes you just need someone to set things straight.
Kent Beck once told me something profound: “Any argument that can’t be settled in five minutes can’t be settled by arguing.”
Being purely technical is what many seek, but this makes too much sense; it's good advice for that personality type:
The worst thing a professional programmer can do is to blissfully bury himself in a tomb of technology while the business crashes and burns around him. Your job is to keep the business afloat!
Ever have someone tell you they will be working on a project at 50% of their capacity? So that's half a team member? The author goes into the whys of this next statement:
Now here’s a rule: There is no such thing as half a person.
Teams are harder to build than projects. Therefore, it is better to form persistent teams that move together from one project to the next and can take on more than one project at a time. The goal in forming a team is to give that team enough time to gel, and then keep it together as an engine for getting many projects done.
I me mine
Sunday, April 08, 2012
Sunday, June 12, 2011
Progress in dd
So copying those images with dd sometimes just makes me anxious. Good thing you can get it to print some progress by sending dd the USR1 signal.
sudo kill -USR1 $(pgrep '^dd')
That command will let me know something is happening when copying stuff to those SD cards that don't even blink ;-)
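The trick can be tried safely on a throwaway transfer. This sketch assumes GNU coreutils dd (BSD dd listens for SIGINFO instead, and the sizes here are made-up values just to keep dd busy for a moment):

```shell
# Start a harmless dd in the background, capturing its stderr.
dd if=/dev/zero of=/dev/null bs=1M count=20000 2>progress.log &
dd_pid=$!
sleep 1
# Ask dd for a status line; ignore the error if it already finished.
kill -USR1 "$dd_pid" 2>/dev/null || true
wait "$dd_pid"
cat progress.log   # records in/out and bytes copied so far
```

Newer versions of GNU dd also accept `status=progress`, which prints a continuously updating transfer line without any signalling.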
Thursday, April 07, 2011
PackageKit debuginfo on MeeGo
So MeeGo comes with PackageKit. It's not completely stable, so sometimes things go wrong. Running the daemon in verbose mode can provide loads of information; we can start by running this:
sudo /usr/libexec/packagekitd --verbose --disable-timer
That command will run successfully if the daemon isn't running yet of course. Now if we want to go the extra mile and provide a useful bug report we'll need to install debugging packages:
pkcon repo-enable updates-core-debuginfo
pkcon refresh
pkcon install PackageKit-debuginfo libzypp-debuginfo gdb
So now we can attach gdb to the pid corresponding to packagekitd:
sudo gdb -p `pgrep packagekitd`
If all goes well, you'll see a couple of key messages while gdb is loading like these:
Reading symbols from /usr/libexec/packagekitd...Reading symbols from /usr/lib/debug/usr/libexec/packagekitd.debug...done.
done.
Reading symbols from /usr/lib/libpackagekit-glib2.so.14...Reading symbols from /usr/lib/debug/usr/lib/libpackagekit-glib2.so.14.0.3.debug...done.
done.
Reading symbols from /usr/lib/packagekit-backend/libpk_backend_zypp.so...Reading symbols from /usr/lib/debug/usr/lib/packagekit-backend/libpk_backend_zypp.so.debug...done.
After loading, we'll get a prompt where we just continue the execution by entering a c:
(gdb) c
If there's a crash, we'll have a chance to see or provide a backtrace by typing in bt at the gdb prompt.
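For a bug report it helps to capture that backtrace to a file rather than copy it off the terminal. A sketch using gdb's logging commands (the file name is made up; on MeeGo-era gdb these commands work as shown):

```
(gdb) set logging file packagekitd-bt.txt
(gdb) set logging on
(gdb) thread apply all bt
(gdb) set logging off
```

`thread apply all bt` dumps a backtrace of every thread, which is usually what the PackageKit maintainers will ask for.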
Tuesday, December 28, 2010
Tweaking the find-provides and find-requires when building RPMs
Suppose for a second you don't want to list all the dependencies for an RPM; they might be contained within your package, and listing them might disrupt the rest of the system. To solve that issue while still keeping some automation for the Provides and Requires tags in rpm, one can tune them a bit.
By default, these days at least, this is handled by two scripts:
- /usr/lib/rpm/find-requires
- /usr/lib/rpm/find-provides
To override them with your own versions, define these macros in your spec file:
%define __find_provides [find-provides]
%define __find_requires [find-requires]
Here [find-provides] and [find-requires] are the relative or absolute paths to the scripts you are replacing them with. Doing only this will not work; this statement is also needed in the spec file:
%define _use_internal_dependency_generator 0
Once that's done, everything should work as expected. To make sure this is still the case when you read this, check that these are the macro/script combinations called upon to resolve dependencies:
grep __find /usr/lib/rpm/macros
#%__find_provides %{_rpmconfigdir}/rpmdeps --provides
#%__find_requires %{_rpmconfigdir}/rpmdeps --requires
%__find_provides %{_rpmconfigdir}/find-provides
%__find_requires %{_rpmconfigdir}/find-requires
#%__find_conflicts ???
#%__find_obsoletes ???
These defaults wouldn't affect your redefinition in the spec file, though.
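As a sketch of what such a replacement script can look like, the usual pattern is to pipe the stock generator's output through a filter that drops the dependencies your package satisfies internally. Everything below is hypothetical: the script name, and the library names being filtered, are made up for illustration.

```shell
#!/bin/sh
# my-find-requires: hypothetical script to point %__find_requires at.
# In a real wrapper, the file list arriving on stdin would first be
# handed to the stock generator:
#   /usr/lib/rpm/find-requires | filter_internal_deps
filter_internal_deps() {
  # Drop requirements on libraries bundled inside the package itself.
  grep -v -e '^libbundled' -e '^libprivate'
}
# Stand-in input so the filter can be demonstrated directly:
printf 'libfoo.so.1\nlibbundled.so.2\n' | filter_internal_deps
# prints only libfoo.so.1
```

The same shape works for a find-provides replacement; only the stock script being wrapped changes.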
Sunday, November 28, 2010
Getting around Network Manager problems with WPA2
I ran into a problem with some certs some time ago and wanted to write about it for future reference. It was due to the certificate's formatting, or something of the like.
While the openssl command line utility had no problems with them, NetworkManager refused to accept them as valid certs.
What needed to be done was to remake them with openssl:
$ openssl pkcs12 -in original.p12 -out temp.pem
$ openssl pkcs12 -in temp.pem -export -name "Repackaged PKCS#12 file" -out new.p12
All this was taken from a bug report on Launchpad.
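The repackaging can be rehearsed end to end on a throwaway self-signed bundle before touching a real one. In this sketch the file names, subject, and pass phrase are all made up:

```shell
# Create a throwaway key and self-signed cert, then bundle them
# into a PKCS#12 file standing in for the problematic original.
openssl req -x509 -newkey rsa:2048 -keyout key.pem -out cert.pem \
  -days 1 -nodes -subj "/CN=example"
openssl pkcs12 -export -inkey key.pem -in cert.pem \
  -passout pass:secret -out original.p12
# The actual fix: unpack to PEM and re-export as a fresh PKCS#12.
openssl pkcs12 -in original.p12 -passin pass:secret -nodes -out temp.pem
openssl pkcs12 -in temp.pem -export -name "Repackaged PKCS#12 file" \
  -passout pass:secret -out new.p12
```

Note that `temp.pem` holds the private key unencrypted (`-nodes`), so delete it once `new.p12` is confirmed working.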