weekly Diigo Post (weekly)        

Posted from Diigo. The rest of my favorite links are here.




          3 Reasons why Chromebooks might be a good fit for your nonprofit        

When we speak with nonprofit organizations, we often hear about challenges related to technology resources. When it comes to investing in new technology, it's important to consider three primary factors:

  • Security: Does it keep my information private and secure?
  • Compatibility: Does it work with the programs I use?
  • Price: Is it within budget?
To address these questions, Google created the Chromebook, a series of laptops built on ChromeOS. The vision behind Chromebooks is simple: create a safe, accessible, and affordable laptop. To protect user privacy and security, Chromebooks update automatically and provide virus protection, encryption, and safe browsing. For easy access and collaboration, they come with Gmail, Google Docs, and Hangouts (and nonprofits receive the full Google Apps bundle with 30GB of storage per user at no charge). What's more, they start at $169 USD, and that's for a laptop with more than 10 hours of battery life!
ASUS Chromebook C201 ($169)

Case Study

Charity:water, a nonprofit organization that provides clean and safe drinking water to people in developing countries, has a "100 percent model": every dollar donated goes directly to fund clean water projects. As a result, resources for operations are limited. To cover operational costs like salaries and supplies, the organization relies on a few passionate and dedicated supporters. With this in mind, Charity:water transitioned to Chromebooks to improve the efficiency of its staff's workflow. Now employees can spend more time focusing on their goals and working towards their mission of bringing clean water to people in need.

Want to learn more?

Chromebooks give nonprofits unified access to the Google Apps suite, including:

  • Google Docs, Sheets, and Slides: Create documents, spreadsheets, and presentations and collaborate in real time. Files are automatically backed up online, and you can also open and edit Microsoft Word, PowerPoint, and Excel files.
  • Google Hangouts: Make phone calls, share your screen, and video chat.
  • Google Drive: Store, sync, and share documents in the cloud for secure and easy access.

As a nonprofit, you also receive discounted access to Chrome licenses, which give you management controls through Chrome Device Management, a unified way to manage all of your nonprofit's users, devices, and data. For nonprofits, the Chrome management license is discounted to just $30, compared to the standard $150.

Chromebooks reflect our vision of laptops that are cheaper, easier to use, and more secure. With Google Apps installed out of the box, they help nonprofits maximize impact while saving both time and resources.


To see if your nonprofit is eligible to participate, review the Google for Nonprofits eligibility guidelines. Google for Nonprofits offers organizations like yours free access to Google tools like Gmail, Google Calendar, Google Drive, Google Ad Grants, YouTube for Nonprofits and more. These tools can help you reach new donors and volunteers, work more efficiently, and tell your nonprofit’s story. Learn more and enroll here.

To learn more about Chromebooks for nonprofits, take a look at Google for Work's Chromebooks website. To take advantage of the Google Nonprofit license discount, fill out the Contact Us form and a Google partner will reach out to you.


          Improve Your Credit Score - Part 3        
My system actually won't work for everybody.

As a matter of fact, it won't work for people
who don't take action and people who think
that their credit problems will magically go
away.

It simply won't work for everybody. But for
people who do take action, follow directions,
and are willing to put in a little effort,
I think the results are outstanding.

'THANK YOU' for taking action and doing what it
takes to have great credit.

Here's the special website
-------------------------------------
Get SIX Free Bonuses!
-------------------------------------

*Bonus #1: My PROVEN Step-By-Step Credit
----
Repair Letters
These proven letters are the EXACT letters that
the big credit repair agencies use. Simply put,
they work like crazy! (Just copy, paste, and send)

*Bonus #2: The Budgeting Success Guide!
----
A complete budgeting guide with 3 PDF guides to
help you track your credit card debt, income, and
expenses.

*Bonus #3: The Interest Annihilator Phone Script!
----
Use this simple script to annihilate your interest
rates on your credit cards immediately!

*Bonus #4: Interest Elimination Video
----
Real Life Video Showing How To eliminate your high
interest payments

*Bonus #5: The Complete Transcripts of All
Audio Interviews!
----
Complete Transcripts of the 4 interviews with
credit experts Donna Fox, Scott Bilker, and
Brett Bruce.

*Bonus #6: My "Snowball" Reduction System
----
This special spreadsheet will help you pay
off your debt faster and easier than ever
before. Just plug in your numbers and see
how to reduce your large bills by "following the plan."

Go here right now and claim your copy before it's
too late.

          Why You Shouldn't Trust Yourself
Bertrand Russell has a famous line: "The whole problem with the world is that fools and fanatics are always so certain of themselves, while wiser people are full of doubts."

After many years of reflection, I have come to appreciate how important it is to find peace amid instability and ambiguity, amid questions about our beliefs and dreams, and above all amid doubts about ourselves. In earlier articles I have repeated one message over and over: our brains are not actually trustworthy at all. We have no solid basis for much of what we say, think, or do.

But I have not yet had a chance to lay out specific explanations and evidence for that claim. So now is a good time for us to explore eight reasons, from the perspective of psychology, why we should not trust ourselves.

1. WE FAVOR OURSELVES AND BECOME SELFISH WITHOUT EVEN REALIZING IT

There is a term in psychology called the Actor-Observer Bias.

For example, if you see someone run a red light at an intersection, you will probably think they are a terrible person: they could not wait a few seconds and put everyone else on the road in danger.

But if you are the one running the red light, you tell yourself it was just an accident: that damn tree was blocking your view, and besides, people have run red lights before without anything bad happening.

Same action, but when someone else does it they are obviously a jerk, and when you do it, it was just an honest mistake.

We all do this, especially when conflict is involved. When we talk about someone who has hurt us, we describe them as callous, irresponsible creatures who maliciously set out to harm others. (1)

But when we talk about a time we hurt someone, we come up with every reason imaginable to show that our behavior was perfectly justified. In this case, our mind rationalizes that the harm we caused was not really that bad, so being blamed for it feels completely unfair.

Both perspectives are wrong. Psychological research has shown that both perpetrators and victims distort the facts when they retell the story from their own point of view.

Steven Pinker calls this the Moralization Gap.(2) Whenever conflict appears, we overestimate our own good intentions and underestimate the intentions of others. This leaves us convinced that the other person deserves punishment while we deserve leniency.

Our mind does this unconsciously: even while we are justifying ourselves, we believe we are being perfectly reasonable. In reality, we are not.

2. YOU DON'T KNOW WHAT MAKES YOU HAPPY (OR MISERABLE)

In his book Stumbling on Happiness, Daniel Gilbert, a psychologist at Harvard, shows how bad we are at remembering what affected our feelings in the past and at predicting what will affect them in the future.

For example, if your favorite sports team loses a big game, you feel awful. But it turns out that your memory of how sad you felt has little to do with how sad you actually were, or for how long. In other words, when we remember the bad things from the past, we pile on far more negative emotion than we actually experienced at the time. Likewise, when we recall good memories, we tend to remember them as happier than they actually were.

So when we plan for the future, we overestimate how happy the good things will make us and assume the unlucky things will make us feel unbearably awful. And in the process, we often fail to notice how we actually feel in the present moment.

This is also one of the excuses people give for failing to pursue happiness. All the evidence suggests that we do not even know what happiness really feels like for us, so how can we truly live in it?

3. WE ARE EASILY MANIPULATED INTO MAKING BAD DECISIONS

You are walking down the street when a polite stranger approaches, wanting to give you a few "free" books or magazines; the moment you take them, they ask you to join this or that and to donate some money because they are going through a hard time. Has this ever happened to you? You know it is strange and you want to refuse. But hey, the stuff was free, and you really do not want to look like someone who could not care less about people in need.

Yeah, that is exactly the point.

It turns out that human decisions can be swayed in many different ways, one of which is giving people a "gift" before asking them for something in return (which also makes it more likely they will keep the gift).

Or try this: next time you see a long queue and want to cut to the front, just give people a reason, any reason, such as "I'm in a hurry" or "I'm not feeling well." In experiments, simply offering a short reason like this gives you up to an 80 percent chance of successfully cutting in, far better than offering no explanation at all. The best part: the reason does not need to be logical or even meaningful.(3)

Behavioral economists have shown that we easily fixate on one price over others for no real reason. Look at the figure below:

On the left, the difference in price looks large and unreasonable. But add a $50 bottle of wine, as on the right, and suddenly the $30 bottle seems moderately priced and perhaps even a bargain.

Here is another example. Suppose you are told you can spend $2,000 on a trip to Paris with breakfast included, a trip to Rome with breakfast included, or a trip to Rome without breakfast. It turns out that adding the "Rome without breakfast" option makes more people choose Rome with breakfast over Paris. Why? Because compared with Rome without breakfast, Rome with breakfast looks like a great deal, and our brain quickly forgets that Paris was ever on the table.(4)

4. YOU MOSTLY USE LOGIC AND REASONING TO JUSTIFY THE BELIEFS YOU ALREADY HOLD

Researchers have shown that even people whose visual processing centers in the brain have been damaged can still "see" without being aware of it. These blind individuals say they cannot see their own hand even when it is held right in front of their face. But if a light flashes in front of them, whether on the left or the right, they can usually guess correctly which side it flashed on.

And yet they will still tell you it was nothing but a guess.

They have no sense at all of which side the light was on, yet they can still judge where it was.

This reveals an absurd irony: knowledge and the feeling of knowing are completely unrelated.(5)

And just like these blind individuals, we all carry knowledge without the feeling of knowing it. But the reverse is also true: you can feel as though you already know something when in fact you do not.

This is the root of every bias and every fallacy. We cannot tell the difference between what we actually know and what we merely think we know. As a result, motivated reasoning and confirmation bias play out in our everyday lives over and over again.

5. YOUR EMOTIONS SHAPE YOUR PERCEPTION MORE THAN YOU THINK

If you are like most people, you tend to make bad decisions based on your emotions. A coworker makes a few jokes about your shoes and you fly off the handle, because they were a keepsake from your late grandmother. So you decide you are going to "destroy these people," quit your job, and live off welfare. Not exactly a rational decision.

But that is not even the worst part.

Even when we are aware that emotions drive our decisions, and we avoid making decisions while we are worked up, that is still not a lasting solution. Emotions can keep influencing us for months or years, even after we have calmed down and begun to "analyze" the earlier situation. The short-term emotions of a single moment turn out to have long-term effects on the decisions we make later.(6)

Take this example. A friend suddenly invites you out for a drink. But for some reason, your defenses go up. You do not want to say yes right away, even though you are fond of this friend and would like to talk with him more. You are cautious about committing, and you do not understand why.

What you have forgotten is that you once had a friend whose moods ran hot and cold. Every so often he would blow up at you out of nowhere. You moved on with your life and gradually forgot those details. Your relationship with that old friend even returned to normal.

Back then, you felt hurt and frustrated more than once. Cognitively, you may have forgotten the old story, but your emotions have not. They still remember how bad it felt. So now, facing a completely different person in a completely different situation, they unconsciously raise a wall to protect you.

We often rely on memories of how we felt at some earlier moment as the basis for many of the decisions we make afterwards. The problem is that you are completely unaware of it. An emotion you felt three years ago can affect you at any time.

Speaking of memory…

6. YOUR MEMORY IS UNRELIABLE

Elizabeth Loftus is one of the pioneering researchers in the field of memory, and she was among the first to tell us that our memories are thoroughly unreliable.

In essence, she found that our memories of past events are routinely altered by other past events and/or by new, inaccurate information. Her work led many people to realize that eyewitness testimony is not necessarily reliable evidence.(8)

Loftus and other researchers have shown that:

  • Our memories not only fade over time, they also grow more prone to incorporating false information.
  • Warning people that their memories may contain misinformation does not reduce that misinformation.
  • The more empathetic you are, the more readily you absorb misinformation into your memories.
  • Our memories are not only altered by misinformation; entire memories can be fabricated, and this happens most easily when the people planting them are people we trust.

So our memories are not nearly as reliable as we think; even the things we believe to be true are only what we think is true.

In fact, neuroscientists have been able to predict when people will misremember information based on their brain activity. How?

They explain it by comparing human memory to computer memory. At first, memory records everything that comes in. Then it slows down, losing or corrupting files once you stop using them for a while. (10)

But our brains don't just store spreadsheets, text files, and GIFs. Memory lets us learn from past events so that we can make better decisions in the future. But memory also serves another important and complicated function that we rarely think about.

As humans, we need an identity, a sense of "who we are," in order to navigate complex social situations and get things done. Memory gives us that identity through the stories of our past.

Seen this way, it doesn't much matter whether our memories are accurate. What matters is that the stories of our past create our sense of self. And rather than needing perfectly accurate memories for this, we really only need fuzzy ones, which we then embellish with whatever details fit the "self" we have constructed and accepted.

Maybe you remember how much your brother or your friends bullied you and how hurt you felt. For you, that explains why you are sensitive, anxious, and a little withdrawn. But perhaps that memory didn't actually hurt you as much as you think. Perhaps, when you recall your brother bullying you, you take the emotions you feel right now (sensitivity, anxiety, timidity) and anchor them onto the old memory, even though those feelings may have nothing to do with the bullying at all.

From that point on, the memory of a cruel older brother who hurt you, accurate or not, becomes attached to your identity as a sensitive, anxious person. And that identity, in turn, leads you to embarrassing behavior and creates even more suffering in your life.

You might be asking me: "Mark, so 'who I think I am' is just a pile of made-up ideas my brain invented?"

Yep, pretty much.

7. YOU ARE NOT WHO YOU THINK YOU ARE

Consider this: the way you present yourself on Facebook is probably not quite the way you present yourself once you leave the "virtual world." The way you behave around your grandparents is probably quite different from how you behave out with your friends. You have a "work self," a "home self," a "family self," an "alone self," and plenty of other selves you use to navigate and survive this complicated social world.

But which one is the "real" you?

We might think that one of these versions expresses our truest self. But once again, all you are doing is repeating the story that already occupies your head, a story built out of inaccurate information.

For several decades now, social psychologists have been uncovering a truth we find hard to accept: the core self, the unchanging, permanent self, is an illusion. (11) Newer research is beginning to show how the brain constructs the sense of self, and how psychedelic drugs can temporarily alter the brain's functioning and, with it, our sense of who we are.(12)

The irony is that all of these experiments, published in expensive books and written by well-known experts, are essentially repeating what traditional Eastern philosophy and meditation masters have been teaching for centuries, and all those masters ever did was sit quietly in caves and think about nothing for a few years.(13)

In the West, many cultures constantly celebrate the individual self, not to mention the advertising industry, so we keep trying to "define" ourselves and rarely pause long enough to ask whether the chase is even worth it. Perhaps the idea of a "personal identity" helps us as much as it harms us. Perhaps it binds us more than it frees us. Of course it is great to know what you want and what you like, but you can still pursue your dreams and goals without clinging to a fixed idea of who you are.

8. YOUR PHYSICAL EXPERIENCE OF THE WORLD IS NOT ENTIRELY REAL

We possess a complex nervous system, and thanks to it information streams into the brain constantly. By some estimates, our sensory systems (sight, touch, smell, hearing, taste, and balance) send about 11 million bits of information to the brain every second. (14)

Even so, we still cannot perceive most of what is out there. The light we can see is only a small part of the electromagnetic spectrum; birds and insects can see ranges that we cannot. Dogs can hear and smell things whose existence we are not even aware of. Our nervous system works both as a machine for gathering information and as a machine for filtering it.

On top of that, the conscious mind can only handle about 60 bits of information per second when we are engaged in intellectually demanding activities such as reading or playing an instrument. (15)

That means you are conscious of roughly 0.000005454% of the information your brain receives while you are awake.

To put that in perspective, imagine that for every word you see and read in this article, there are 536,303,630 other words you cannot see.

That, more or less, is how we move through the world every day.

Author: Mark Manson
Translated by: Hạnh Nguyên
Source: https://markmanson.net/trust

Notes:

1. See Roy Baumeister and Aaron Beck's Evil: Inside Human Violence and Cruelty.
2. See: The Better Angels of Our Nature by Steven Pinker, Chapter 8, to be exact.
3. These experiments and more explained in Robert Cialdini's timeless book Influence.
4. This is a shitty summary of an experiment conducted by Dan Ariely of Duke University, discussed in his excellent book Predictably Irrational.
5. In fact, your brain has completely independent processes for each of these and both function independently of logic and reason. See Dr. Robert Burton's book On Being Certain: Believing You Are Right Even When You're Not.
6. Andrade, E. B., & Ariely, D. (2009). The enduring impact of transient emotions on decision making. Organizational Behavior and Human Decision Processes, 109(1), 1–8.

          Comment on Spreadsheet FUN by Jed I Knight        
YAAAAAAAAAYYYYYYYY SSSNNAAAKKKEEES
          Getting started with The Gamma just got easier        

Over the last year, I have been working on The Gamma project, which aims to make data-driven visualizations more trustworthy and to enable a large number of people to build visualizations backed by data. The Gamma makes it possible to create visualizations built on trustworthy primary data sources such as the World Bank, and you can provide your own data source by writing a REST service.

A great piece of feedback I got when talking about The Gamma is that while this is a nice ultimate goal, it makes it hard for people to get started. If you do not want to use the World Bank data and you are not a developer who can write your own REST service, how do you get started?

To make starting with The Gamma easier, the gallery now has a new four-step getting-started page where you can upload your data as a CSV file or paste it from an Excel spreadsheet and create nice visualizations that let your readers explore other aspects of the data.

Head over to The Gamma Gallery to check it out or continue reading to learn more about creating your first The Gamma visualization...


          The Gamma dataviz package now available!        

There were a lot of rumors recently about the death of facts and even the death of statistics. I believe the core of the problem is that working with facts is quite tedious and the results are often not particularly exciting. Social media made it extremely easy to share your own opinions in an engaging way, but what we are missing is a similarly easy and engaging way to share facts backed by data.

This is, in essence, the motivation for The Gamma project that I've been working on recently. After several experiments, including the visualization of Olympic medalists, I'm now happy to share the first reusable component based on this work, which you can try out and use in your own data visualization projects.

The package implements a simple scripting language that anyone can use to write simple data aggregation and data exploration scripts. The tooling for the scripting language makes it easy to create and modify data analyses: editor auto-complete offers all available operations, and a spreadsheet-inspired editor lets you create scripts without writing code, yet you still get a transparent and reproducible script as the result.


          This is what privatisation did to Australia's household electricity bills        

When three eastern states and one southern state formed the National Electricity Market in December 1998, Australia had among the lowest retail electricity prices in the world, alongside the United States and Canada.

The rules that underpin the National Electricity Market are made by the Australian Energy Market Commission (AEMC), set up by the Council of Australian Governments (COAG), through the COAG Energy Council, for that purpose and to advise federal and state governments on how best to develop energy markets over time.

The Australian Energy Regulator (AER) sets the amount of revenue that network businesses can recover from customers for using networks (electricity poles and wires and gas pipelines) that transport energy.

So far so good. There's a defined market and there are rules.

Then the privatisation of electricity supply and infrastructure began in earnest.

It should come as no surprise that this push towards full privatisation, with its downhill spiral in service delivery and uphill climb in cost to retail customers, began and was progressed during the term of Liberal Prime Minister John Howard.

By 2017 the NSW Berejiklian Coalition Government had almost completed its three-stage privatisation of state power infrastructure by selling off poles and wires, and it goes without saying that the retail cost of electricity is expected to rise again next year.

This is where we stand today.

[Graphs in Financial Review, 4 August 2017]
The Financial Review, 4 August 2017:

The annual cost to households of accepting a standing offer from one of the big three retailers instead of the best offer in the market has been estimated at $830 in Victoria, $900 in Queensland and $1400-$1500 in NSW and SA by the St Vincent de Paul Society.

Mr Mountain said power bills are constructed in such a complex way that ordinary customers without sophisticated spreadsheet and analytical skills have little hope of analysing competing offers to work out which offers them the best deal.

Private comparison websites do not include all market offers and charge retailers for switching customers, while the websites offered by the Australian Energy Regulator and the Victorian government do not provide the tools customers need to discriminate among offers.

Prime Minister Malcolm Turnbull has ordered the Australian Competition and Consumer Commission (ACCC) to conduct an inquiry into electricity supply, costs and pricing, including retail pricing.

The Treasurer should have a preliminary report from the ACCC in his hands by the end of September this year; however, the body will not submit its final report until 30 June 2018, and there is no guarantee that any recommendations will be adopted by government or industry.

Quite frankly, it appears the privatisation train left the platform some time ago and there is no way to halt or divert it in order to genuinely benefit household consumers.


          Thank you Apple and Google        
Some people were not very excited about Apple's keynote yesterday but the 1.1.3 firmware update for the iPhone is plenty enough for me, along with Google's updated mobile apps. I use Gmail a lot on my iPhone and one of my clients has standardized on Google Mail/Docs for their communications so I'm constantly reading mail and documents on my iPhone. Gmail was OK on the iPhone and Google Docs was bearable but Google Reader was a nightmare. At the weekend, I noticed Gmail suddenly got a lot nicer with a very iPhone-style UI, sliding panels between labels and mail. Great... now what about the other apps? Tuesday night, I got home from said client's site and eagerly updated my iPhone firmware. The new "location" feature in the Maps application is very sweet (and seems sufficiently accurate for my needs). Then I started reorganizing my home screen. Screens. That's when I noticed that Google had updated most of its apps to be iPhone-friendly. Google Docs makes a great reader now, even for fairly large spreadsheets. Google Reader is a huge improvement! So now my iPhone has:
  • 43actions - a great little GTD (Getting Things Done) task manager
  • Calculator
  • Calendar
  • Clock - with 10 cities
  • Maps
  • Notes
  • Stocks
  • Weather
Then my menu bar is:
  • Mail
  • Phone
  • Safari
  • Settings
On screen two, I have a row of games, followed by my multimedia tools:
  • Camera
  • iTunes
  • iPod
  • Photos
  • Text
  • YouTube
And, yes, they are in alphabetical groups. Call me anal retentive and see if I care! Anyway, a big thank you to Apple and Google (and those games companies) for making my iPhone an even more lovable and addictive little toy!
          ESM's QuickLessons A DearMYRTLE Genealogy Study Group Lesson 20        

Hilary Gadsby


QuickLesson 20: Research Reports for Research Success
Elizabeth Shown Mills, “QuickLesson 20: Research Reports for Research Success," Evidence Explained: Historical Analysis, Citation & Source Usage (https://www.evidenceexplained.com/content/quicklesson-20-research-reports-research-success : accessed 17 Sept 2016).

This week we will be discussing the research process.

How do we do our research?

How should we do our research?

Can we improve how we research?

With the growth of the internet, how many of us find ourselves joining in with the quick-click genealogy we so frequently criticise?

Why do we criticise this way of doing things?

  1. Insufficient preparation
  2. Poor recording
  3. Insufficient analysis
So what should we be doing?
Ask yourself these questions.
  1. What do I want to find?
  2. Where should I be doing my research?
  3. How am I going to do the research?
  4. How am I going to record what I find?
  5. How am I going to review what I find?
We can use pen and paper or our computers to assist in these tasks.

We know how to interrogate the databases online and how to enter our results into our software program. There is plenty of information telling us how to do this, whether digital or on paper.

Do any programs tell us what we need to look for?

Do any programs tell us whether what we find is relevant?

Poor preparation and lack of analysis can lead to hours of wasted research.

How can we know what we need to find if we have not analysed what we already know?

Creating a research plan will be the best thing you do. It will keep you on track. 

If we wish to move on from being just "information gatherers and processors" as ESM states in this lesson we must consider how we approach our work.

This weekend I came across an individual who had been recorded by another researcher on the WikiTree website with the maiden name of ROSLING. However, this was not the surname of the parents. The link to the 1911 census revealed that she was recorded as their adopted daughter. There was also a link to an army record showing a date of birth in keeping with the census record.
I am researching the surname ROSLING and was interested in knowing where she fitted into the lineage I am constructing.
If I just entered her name in a search, would I find anything, and how would I know if what I found was relevant?

Experienced researchers will often know exactly where to research and which records may help them find what is available. This does not preclude them from the planning stages, but it may reduce the time needed to formulate the plan. Even the experts find themselves stumped occasionally and have to consider alternative strategies. Researching in a new area, be it geographical or an unfamiliar set of records, may require a different skill set and a whole new learning experience. If we are to complete thorough research, we have to be aware of the resources available.

Even the best plans may need to be altered in the light of new information. Being prepared and analysing what has been found may alter our focus or the manner in which we carry out our research.
The ability to plan and analyse helps us make better use of the research time.

Complex questions may only be answered if we look at all the information we have and understand what it's telling us.
Some researchers have found that a program such as Evidentia can help them formulate a plan for these complex problems. By entering each piece of information, deciding what it is saying and, importantly, how reliable that information may be, we gain a clearer understanding of what we already know.
The source of any information may be flawed. Awareness of reliability, and being able to resolve conflicting information, are analysis skills that may only come through experience and education.
Learning from others and sharing personal experience helps each of us become better researchers by improving the knowledge base.

Do we read any accompanying information about a record group that we find online before we enter a name in the search box? If not, why not? Surely we need to know whether the record is likely to provide the information we need before we search. Would you travel miles to an archive or cemetery without first checking that they have what you are looking for? The same should be true for online records. Finding information and blindly entering it into a database is as boring and pointless as writing lines was as a school punishment. If you want the reward of finding that elusive connection, you need to spend time preparing and analysing: formulate a plan, familiarise yourself with what may be available, pinpoint the best way to approach the task, and adapt the plan as and when more information is discovered. And remember that negative results do not mean negative evidence; it may simply be that the record has not survived.

As we near the end of this study group, we need to pull together all that we have discussed.

I am writing about my research mentioned above on my One Name Study blog. I have not included specific examples this week as I believe that this lesson is more about understanding the process and the importance of doing this well. 
Only we as individuals know whether we have been disciplined in the past.
Hopefully our discussions may have helped at least one of those watching to become researchers rather than gatherer/processors.

When I started researching, few records or indexes were available online and internet access was expensive.
I was not aware of research plans, so I would go armed with notes that I had made to guide my research.
Whilst looking for ancestors in the BMD indexes on microfiche I would have a name, a range of years, and a geographical area. When I found a possible candidate I would record it and order a certificate.
The only way I could access the census was using indexes, and then, when I could get to the local archive, I would have to scroll through the microfilm to find what I wanted.
The internet has made finding many records easier, but has it also created a group of individuals who believe the adverts showing families building trees using only a single website?
No website will ever contain all the records, and whilst the records support our research, they are not the researcher.
Who pieces together which record is relevant to each individual, who is related to whom, and how all these individuals are related? It is we as researchers who analyse the information and decide its relevance.

The reporting suggested by Elizabeth Shown Mills may sound quite prescriptive and academic, and unless you have an academic background you may switch off at the thought of report writing. However, what she is saying is this:

  1. Compile your findings complete with the information needed to find them again. 
  2. Collect them together in a manner that you are comfortable working with or that fits with your findings.
  3. Summarize what you have found.
  4. Decide whether you have answered your research question.
  5. Decide whether you need to do more research and create a new research plan.
  6. Make a conclusion and write a reasoned report to support this.
Personally I would say that Evidentia will help you do all of these in a guided way.

Finally here is a link to a Google Sheet I created called The Family History Research Process. It contains links to documents that others may find useful. Please add your comments if you think I may have missed something useful that could be added.

          ESM's QuickLessons A DearMYRTLE Genealogy Study Group Lesson 11        

Hilary Gadsby

QuickLesson 11: Identity Problems & the FAN Principle    
Elizabeth Shown Mills, “QuickLesson 11: Identity Problems & the FAN Principle,” Evidence Explained: Historical Analysis, Citation & Source Usage (https://www.evidenceexplained.com/content/quicklesson-11-identity-problems-fan-principle : accessed 25 May 2016).

The FAN Principle referred to in this lesson is looking at friends, associates and neighbours to help find information pertinent to the person we are researching.

Common names can be a particular problem. I have the surnames SMITH and WARD in my husband's family and ROBERTS in mine.
However some surnames can be a problem in particular localities as many are what we call locational surnames and were adopted from the place where the family lived when surname usage started.

I have recently started a One Name study of the surname ROSLING and I shall use an example from this.
The origins of the family I have been researching appear to be in Lincolnshire, England, as far as the current level of research in the UK has shown. 
(My research is at an early stage and earlier records may uncover different origins as I am aware of this surname elsewhere in Europe and it could have been introduced to Lincolnshire from an early invasion)

The name Peregrine Rosling would not be considered a difficult name to research, as neither the first name nor the surname is common. However, if you know the family, the first name Peregrine is one that has been used by several generations.



This shows the results of a general search for this name at Find My Past



Peregrine Rosling, born in Swinstead, Lincolnshire, England, has been particularly problematic to follow. Two people appear on the census records with the same year and place of birth.

How do I distinguish who belongs to which family?


I have looked at the other census records 



This Peregrine has a wife, Eliza, who was born in Morton, Lincolnshire, and appears to be living in the same house as Ann and Edward Rosling; could they be close relations?


This Peregrine has a James Mettam, widower, living with him and his wife Elizabeth; he is described as father of Peregrine, and Elizabeth was born in Swinstead. Was his wife Elizabeth Mettam?

The registers held at Lincolnshire Archives have been scanned and digital images are now available to view on the Find My Past website.



Looking at the baptisms for the parish of Swinstead, this is what I have found:
Parish Baptism Register 1813-1871 Swinstead, Lincolnshire
Page 18 No 143 13 Feb 1825 Peregrine son of Peregrine and Ann Rosling Swinstead Labourer
Page 17 No 136 27 June 1824 Peregrine son of Robert and Sarah Rosling Swinstead Farmer

So the first Peregrine could be the son of Peregrine and Ann Rosling?
What was his wife's maiden name?

It is likely that both marriages were registered in the Bourne registration district as all these birthplaces and residences are in this district.
Fortunately Lincolnshire has had many of the marriages transcribed and the indexes can be downloaded. (This link may not be working but I have a copy I downloaded)

I have extracted those of interest and they can be found here.

So how do I confirm that I have the correct Peregrine in each family, now that I have three of them marrying in Swinstead within ten years? Can I find the one who married in 1846 in the 1851 census, and where his wife was born?


So I have discovered the maiden name for each wife and where the wife was born. Each Peregrine had a father with a different first name so I can now have more confidence that I connect each of them and any descendants to the correct family.

Spouse | Birthplace of Spouse | First Name of Father | Christening Date | Name of Spouse's Father
Eliza | Morton | Peregrine | 13 Feb 1825 | Charles Wilson
Elizabeth | Swinstead | Robert | 27 June 1824 | James Mettam
Elizabeth Jane | Castle Bytham | William | 1820? | Robert Glenn


When I have done more work on my one name study I will be able to piece together more about how these families are related. The older Peregrine has not been found in the register of baptisms for Swinstead.

I have started to explore other parish records for this area for information.
Every piece of the puzzle is important to ensure we are looking at the right person in that record. 

I have further work to do so that I can discover more about the family of Ann the wife of Peregrine and mother of the younger Peregrine. Having discovered her maiden name and birthplace I find there are at least 2 possibilities for her baptism. Determining who the possible siblings are and what happened to them may help me discover which baptism and family are most likely to be her. I suspect this will involve a lot more analysis of what the records show and I may need to work with unfamiliar records but understanding the importance of who is in the community will help me pull together the clues.


          DIY Wedding - How to Design and Create your own Wedding


It is no secret that I love to make and create! So it was only natural to apply this passion to our wedding. Creating our own DIY wedding was a great experience and certainly challenging. There are a heap of advantages to having a DIY wedding:

 1. Budget - For those of us who do not want to spend our life savings on a wedding, a DIY wedding can be super cost efficient. We saved a heap of money!
 2. Unique - It gives you the opportunity to have a unique wedding with a heap of personality.
 3. Pride and Experience - It was a great experience and I am proud to have pulled it off!


So, in the coming month I will be sharing my experiences and tips!
Today I am starting with collecting inspiration and deciding on themes.





The following list of reminders helped with the decision-making process throughout the DIY wedding planning and reduced headaches.


- How much are you going to DIY -
I would have loved to DIY everything, but that was unrealistic for me. (e.g. I wanted to make our cake but it just was not possible, so I had it made in a local bakery, but I made the stand (stay tuned to see our cake!))

- Season - 
Consider the time of the year when planning. Winter, Spring, Summer, Autumn. e.g. you don't want a long sleeved wedding dress in the summer.
(We had our ceremony under this tree in the Australian summer)

- Colour scheme - 
Choose a colour (red, pink, green, yellow etc) or a colour theme (e.g. pastels, bright colours, metallic) 
(We had pastels with peppermint green and apricot as the main colours)


- Formality - 
Formal, semi formal, informal/casual 
(Our wedding was casual, which suited the backyard theme)


 - Venue type -
Garden, Backyard, Beach, Hall, Barn, Restaurant, Country, city, tent
(We had a garden, backyard wedding)


- Add your own personality/quirks - 
Put some effort into injecting your personality into the wedding,
e.g. music, hobbies, interests.
(We took our cameras with us for the photos; I sewed our clothes, had cotton spool name tags, crocheted our cake toppers, created our playlist, etc.)


- Inspired by an era - 
Period, Deco, Nouveau, 50's, 60's, 70's, 80's, 90's, modern
(The Art Deco period inspired my dress, jewellery & make up)

- Descriptive words - 
Choose some relevant descriptive terms:
eclectic, quirky, fancy, elegant, rustic, classic, romantic, party, rock, punk, natural, ecological, ethical, vintage, indie, local, organic 
(I chose eclectic/romantic/ethical/local/vintage/party)

- The dreaded budget - 
Every decision needs to be made with the budget in mind! 
(Ours was low!)

- Be practical -
Think practically and try not to get carried away. The reality is that you cannot achieve every fantastic thought/inspiration. You have to learn to edit and have a relaxed attitude! Whatever happens, happens.

-Be organised -
Create a timeline (you can find basic ones on the internet), create Excel spreadsheets to help you organise, and assign roles/jobs for the big day.

- Avoid internet saturation and be confident - 
If you are struggling to make decisions, step away from Pinterest and wedding blogs. Be confident and trust your decisions.



- Think about your journey -
Think about your journey together as a couple and the current stage in your life. How can you incorporate this into your wedding?
(We had a photo wall which reflected our journey and the people in our lives. The music was also a reflection of our journey together)

- Mood board and keep on track -
Use all of this information to create a mood board and keep checking back against your theme ideas. It is easy to get carried away. Make a decision and stick with it, or have a solid reason to change!




Posts still to come...
- How to set up your garden/backyard wedding
- Making your wedding dress
- Making the grooms suit
-  Crocheting the cake toppers
- Making the bridesmaid dress
- Creating your invites
- Choosing your flowers
- Choosing your jewellery
- Choosing a photographer

          Theory Jobs 2017        
In the fall we point to theory jobs; in the spring we see who got them. Like last year and years past, I created a fully editable Google Spreadsheet to crowdsource who is going where. Ground rules:
  • I set up separate sheets for faculty, industry and postdoc/visitors.
  • People should be connected to theoretical computer science, broadly defined.
  • Only add jobs that you are absolutely sure have been offered and accepted. This is not the place for speculation and rumors.
  • You are welcome to add yourself, or people your department has hired.
This document will continue to grow as more jobs settle. So check it often.

Edit
           [ruby][googledrive] I wrote a Ruby library for accessing Google Drive

google-drive-ruby - Github

"Wrote" is a bit of a stretch: I added the ability to upload and download files other than spreadsheets to google-spreadsheet-ruby and renamed it.

It is not a library for building Google Drive Apps. See here for the difference between the two kinds of libraries that access Google Drive.

Installation:

$ sudo gem install google_drive

Usage example:

require "rubygems"
require "google_drive"

# Logs in.
# You can also use OAuth. See document of
# GoogleDrive.login_with_oauth for details.
session = GoogleDrive.login("username@gmail.com", "mypassword")

# Gets list of remote files.
for file in session.files
  p file.title
end

# Uploads a local file.
session.upload_from_file("/path/to/hello.txt", "hello.txt", :convert => false)

# Downloads to a local file.
file = session.file_by_title("hello.txt")
file.download_to_file("/path/to/hello.txt")

# Updates content of the remote file.
file.update_from_file("/path/to/hello.txt")

As for google-spreadsheet-ruby, the plan is for it to become a library that does nothing more than:

require "google_drive"
GoogleSpreadsheet = GoogleDrive


          Let RedPen point out the errors in your technical documents

This post is about getting RedPen, a tool that points out mistakes in natural-language text, running on my own machine. While I was at it, I wrote up a few related thoughts.

Even engineers write and review documents surprisingly often

Since joining Hatena as an engineer I naturally do a lot of code review, but I have noticed that I also review a surprising amount of natural language (in my case mainly Japanese and English). That includes reviews of documents other people have written as well as reviews of my own.

I want machines to point out mistakes in natural language too

With code, each language's linter and the CI tests catch simple mistakes before review. Having humans chase trivial things like typos and syntax errors is exhausting, so it is nice when a machine does it automatically. There are linter-like tools for natural language as well; Microsoft Word is one example. When I was writing research papers I would throw drafts at Word for a quick check of the simple mistakes.

However, throwing technical documents at Word often does not work well. I mostly write in Markdown, and Word of course does not understand Markdown syntax. Having a human decide whether a red squiggle is about Markdown syntax or about genuinely broken Japanese grammar is, again, draining, and I really do not want to be drained.

Let a machine (RedPen) point out whatever rules can catch in natural language too

Natural languages have grammar just as programming languages do, but the biggest difference is ambiguity. It is no exaggeration to say that most of natural language processing is a battle over how to handle ambiguity. Error correction has recently been an active topic in NLP research, and competitions are being held; it was the theme of the well-known CoNLL shared task in 2013.

Accuracy close to 100% is still hard to achieve, which is why this remains a research area, but the simple cases that rules can catch are something I want flagged today, not at some point in the future. RedPen is one piece of software that meets that need.

"Documents as code" (slides from Recruit Technologies)

For the full feature list, see the RedPen website; the parts I found especially good are the following:

  • It supports a wide range of markup languages
    • Including Markdown, and even LaTeX
  • It supports multiple natural languages, such as Japanese and English
  • Checks you do not need can be turned off in the configuration
    • A common pattern is "I don't think this flagged spot is actually wrong"; if a check gets annoying, you can simply disable it

Installation was easy, too.

% brew install redpen

As expected there were a lot of warnings, so I turned off the following validators in the configuration:

  • InvalidSymbol
  • KatakanaEndHyphen
  • EmptySection

Example output

I ran the source text of a recent blog post through RedPen. I have a particular tendency to write long sentences, and the SentenceLength and CommaNumber warnings below are calling out exactly that.

% redpen --format markdown --conf /usr/local/Cellar/redpen/1.8.0/libexec/conf/redpen-conf-ja.xml ~/Dropbox/_posts/2017-03-20-清算用Slack-botを書いた.md 2>/dev/null
2017-03-20-清算用Slack-botを書いた.md:14: ValidationError[SentenceLength], 文長("121")が最大値 "100" を超えています。 at line: 昔のことは忘れてしまうので、清算用のSpreadsheetを作ろうとしましたが、出先で開くのは手間なので、slackからできるといいよねという妻の声がありましたが、外部でちょうどいいサーバーを持っていなかったので、そのときは流れました...。
2017-03-20-清算用Slack-botを書いた.md:14: ValidationError[CommaNumber], カンマの数 (5) が最大の "3" を超えています。 at line: 昔のことは忘れてしまうので、清算用のSpreadsheetを作ろうとしましたが、出先で開くのは手間なので、slackからできる といいよねという妻の声がありましたが、外部でちょうどいいサーバーを持っていなかったので、そのときは流れました...。
2017-03-20-清算用Slack-botを書いた.md:14: ValidationError[DoubledConjunctiveParticleGa], 一文に逆説の接続助詞 "が" が複数回使用されています。 at line: 昔のことは忘れてしまうので、清算用のSpreadsheetを作ろうとしましたが、出先で開くのは手 間なので、slackからできるといいよねという妻の声がありましたが、外部でちょうどいいサーバーを持っていなかったので、そのときは流れました...。
2017-03-20-清算用Slack-botを書いた.md:24: ValidationError[InvalidExpression], 不正な表現 "俺" がみつかりました。 at line: 俺は好きなエディタで書きたいし、gitでコード管理したいし、何ならTypeScriptで書きたいんじゃー、と思っていたところでいいエントリを見つけました。
2017-03-20-清算用Slack-botを書いた.md:29: ValidationError[InvalidExpression], 不正な表現 "最高" がみつかりました。 at line: 最高です、IDEAでリファクタリングや定義元に戻るとかもできるようになったので完璧です。

Using RedPen from Emacs

I write most technical documents in Emacs, so being able to invoke RedPen from inside Emacs would be ideal, and it turns out someone has already built that.

To spare users the trouble of installing RedPen, it defaults to calling a Heroku server that the author set up. In my case I want it to work offline, and to work on documents that must not leave the company, so I changed the configuration to point at my local RedPen instead.

(el-get-bundle karronoli/redpen-paragraph.el)

(define-key markdown-mode-map (kbd "C-c C-r") 'redpen-paragraph)

(defvar redpen-commands
  '("redpen --format markdown --result-format json2 --conf /usr/local/Cellar/redpen/1.8.0/libexec/conf/redpen-conf-en.xml %s 2>/dev/null"
    "redpen --format markdown --result-format json2 --conf /usr/local/Cellar/redpen/1.8.0/libexec/conf/redpen-conf-ja.xml %s 2>/dev/null"))

(defvar redpen-paragraph-force-reading-whole t)

Now C-c C-r shows the proofreading results and lets me jump straight to the spots that need fixing, which makes things very comfortable.

Summary

Let machines point out whatever machines can point out, so that we humans can spend more of our time thinking about the parts that really matter.

理科系の作文技術 (中公新書 624) [Writing Techniques for Scientists and Engineers]


          I wrote a household expense-settlement bot (Slack bot) in Google Apps Script, then converted it to TypeScript

Here is the story. Most of it is copied from people who did it before me.

  • I want to settle household expenses in a Google Spreadsheet
    • Opening the Spreadsheet while out and about is a hassle, so it would be nice to do it from Slack
  • So let's build something that writes to the Spreadsheet from Slack
    • Running it on an external server is a pain, so let's do it with Google Apps Script
  • Writing in the Script Editor is tedious; I want to use my own favorite editor
    • So I made it editable locally and, while I was at it, converted it to TypeScript

I had been wanting a bit of light TypeScript practice, and this turned out to be just the right exercise.

I want to settle household expenses in a Google Spreadsheet

My wife and I keep separate finances apart from a joint account for rent, so we settle up with each other regularly. Because we forget older expenses, I made a settlement Spreadsheet. Opening it while out is a hassle, though, and my wife suggested it would be nice to do it from Slack. We did not have a convenient external server at the time, so the idea fizzled out…

Let's build something that writes to the Spreadsheet from Slack

Recently id:daiksy on my team built a KPT bot that lets you enter KPT items into a Spreadsheet from Slack. It runs on Google Apps Script, so there is no need to run your own external server.

That looked great, so I borrowed the approach and built a settlement bot of my own. It took less than an hour, which was delightful. KUMAO is the name of one of our stuffed animals.
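To give a rough idea of how such a bot can hang together, here is a minimal sketch (not the actual KUMAO bot) of a Google Apps Script web app that receives a Slack slash command and appends a row to a settlement sheet. The spreadsheet ID, the sheet name "Settlement", and the message format are hypothetical placeholders.

// Minimal sketch: Slack -> Google Apps Script -> Spreadsheet.
// Deploy as a web app and point a Slack slash command (or outgoing webhook) at its URL.
// SPREADSHEET_ID and the sheet name "Settlement" are placeholders, not the real bot's values.
var SPREADSHEET_ID = "YOUR_SPREADSHEET_ID";

function doPost(e) {
  // Slack sends form-encoded parameters; "text" is what the user typed,
  // e.g. "1200 groceries", and "user_name" is the sender.
  var text = e.parameter.text || "";
  var user = e.parameter.user_name || "unknown";
  var parts = text.split(/\s+/);
  var amount = Number(parts[0]);
  var memo = parts.slice(1).join(" ");

  var sheet = SpreadsheetApp.openById(SPREADSHEET_ID).getSheetByName("Settlement");
  sheet.appendRow([new Date(), user, amount, memo]);

  // Reply into the channel so both of us can see that the entry was recorded.
  return ContentService
    .createTextOutput(JSON.stringify({ text: "Recorded: " + amount + " (" + memo + ")" }))
    .setMimeType(ContentService.MimeType.JSON);
}

Slack expects a response within a few seconds, which a single appendRow call handles comfortably, so no external server or queue is needed.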


Writing in the Script Editor is tedious; I want to use my own favorite editor

It is nice that this was so easy, but writing code in the Google Apps Script script editor is really tedious. I want to write in my favorite editor, manage the code with git, and ideally write TypeScript, and just as I was thinking that, I found a good blog entry on exactly this.

It is fantastic: I can now refactor and jump to definitions in IDEA, so it is perfect. I have published the result as well.


          Let Mackerel monitor your nasne's free space and your Google Analytics numbers

This is day 6 of the Mackerel Advent Calendar. Yesterday's post was id:buty4649's piece on building a CLI tool that manages package lists with Mackerel metadata.

Hello, I'm id:syou6162, an application engineer at Hatena; I joined the Mackerel team in October. This is a light post about how handy it is to let Mackerel keep an eye on all sorts of things from everyday life.

Posting and monitoring nasne free space and Jenkins job success/failure counts as custom metrics

I love collecting lifelogs: my tweets, step counts, bank balances, and anything else I can capture get stored in Elasticsearch at home. Visualizing and searching that data with Kibana makes monthly and yearly retrospectives easy.

Elasticsearch is great at visualization and search, but it has one problem: it cannot notify you when a value crosses a threshold. When you hoard this much data about yourself there are plenty of situations where you want an alert, for example "tell me before the nasne runs out of space and can no longer record." At first I wrote the Slack notifications myself, but doing it properly got fairly tedious (re-sending a notification every 60 minutes, for instance), so I decided to leave the notification side to Mackerel.

There are several ways to do this; here I will post the values as host custom metrics.

mackerel.io

All you need is a command (or script), registered in the agent's configuration file, that prints the expected format, so it is easy. For the nasne's capacity you do not even need a script; a one-liner like the following posts the metric.

[plugin.metrics.nasne]
command = "echo \"nasne.volumes.freeVolumeSizeRatio\t$(curl -s 'http://10.X.X.X:64210/status/HDDInfoGet?id=0' | /usr/local/bin/jq -r '.HDD | .freeVolumeSize / .totalVolumeSize')\t`date -u +%s`\""

Now the nasne's remaining capacity is reported every minute, and with a monitor configured I can avoid the situation where a recording of "Nigeru wa Haji da ga Yaku ni Tatsu" could not be scheduled because the disk was full. Slack notifications are easy to set up too, so sending them to our household Slack means my wife, who is not especially into IT, can see the situation at a glance. Here is how our nasne's free space actually trends (it hovers at a level where an alert fires every two or three days...).


I use Jenkins to collect the lifelogs, and since the number of jobs has passed 50, individual notifications are too much to keep up with. So I also have Mackerel monitor the success/failure counts as custom metrics. As with the nasne, a simple one-liner in the agent's configuration file is enough.

[plugin.metrics.jenkins_jobs_red]
command = "echo \"jenkins.jobs.red\t`curl -s http://localhost:3000/api/json | /usr/local/bin/jq -r '.jobs[] | .color' | grep -c red`\t`date -u +%s`\""

[plugin.metrics.jenkins_jobs_blue]
command = "echo \"jenkins.jobs.blue\t`curl -s http://localhost:3000/api/json | /usr/local/bin/jq -r '.jobs[] | .color' | grep -c blue`\t`date -u +%s`\""

The graph of success/failure counts below is stable, but when a momentary power outage took down the home Mac running Jenkins I noticed right away, which saved me from a long stretch of failed lifelog collection (the connectivity alert catches this as well).


Posting and monitoring Google Analytics user counts as service metrics

Google Analytics shows the current number of visitors in real time, which is fun just to watch, but I thought it would be even more fun to have it "tell me when something looks like it is blowing up, because I cannot get any work done if I keep checking" (this is purely for fun). Pulling data out of Google Analytics looked tedious, but there is a Google Spreadsheet add-on that fetches Google Analytics data, so I used that.

Google Spreadsheet can run Google Apps Script (more or less JavaScript) on a schedule, so let's use that to post the recent visitor counts to Mackerel. The script looks like this (it is a toy, so it is written rather loosely...).

function postMackerelServiceMetric(apiKey, serviceName, payload) {
  return UrlFetchApp.fetch(
    "https://mackerel.io/api/v0/services/" + serviceName + "/tsdb",
    {
    "contentType" : "application/json",
    "method" : "post",
    "headers" : {
      "X-Api-Key" : apiKey
    },
    "payload" : JSON.stringify(payload),
    "muteHttpExceptions" : true
    }
  );
}

var epoch = Date.now() / 1000;
var mackerelApiKey = "YOUR_MACKEREL_API_KEY";
var serviceName = "Analytics";

function run(term, sheetName) {
  var ss =SpreadsheetApp.getActiveSpreadsheet()
  var mySheet = ss.getSheetByName(sheetName)
  SpreadsheetApp.setActiveSheet(mySheet);
  
  var row = mySheet.getLastRow();
  var users = mySheet.getRange(row, 2).getValue();
  var sessions = mySheet.getRange(row, 3).getValue();
  var pageviews = mySheet.getRange(row, 4).getValue();

  var payload = [
    {"name" : "analytics." + term + ".users", "time" : epoch, "value" : users},
    {"name" : "analytics." + term + ".sessions", "time" : epoch, "value" : sessions},
    {"name" : "analytics." + term + ".pageviews", "time" : epoch, "value" : pageviews}
  ];
  var mackerelPostRes = postMackerelServiceMetric(mackerelApiKey, serviceName, payload);
  Logger.log(mackerelPostRes)
}

function main_day() {
  run("day", "Google Analytics(Day)")
}

function main_hour() {
  run("hour", "Google Analytics(Hour)")
}

Running it from Google Spreadsheet about once every 15 minutes should be plenty. With this in place, it's easy to get notified when it looks like something is blowing up (in reality it was just a retweet that went a bit viral, though).

f:id:syou6162:20161129202636p:plain
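If you'd rather set that schedule up from code instead of through the spreadsheet's trigger menu, a minimal sketch using the standard ScriptApp time-driven trigger API would look like this (assuming the main_hour/main_day functions above; run it once by hand from the script editor):

function installTriggers() {
  // Post the hourly sheet's numbers every 15 minutes and the daily sheet's once a day
  ScriptApp.newTrigger("main_hour").timeBased().everyMinutes(15).create();
  ScriptApp.newTrigger("main_day").timeBased().everyDays(1).create();
}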

Posting/monitoring machine learning classifier performance as service metrics

Having covered the joke material, let me end with a serious use case. More and more companies are bringing machine learning into their services these days, so information about whether a given classifier is performing well or badly keeps streaming in. Just as you want to monitor (with Mackerel!) whether your servers' CPU usage is rising as website traffic grows, the performance of a machine learning classifier (or regressor) also shifts as training data is added or corrected and as tuning proceeds, so being able to monitor that with Mackerel is nice too. So let's post it as a service metric. Service metrics can easily be fetched/added/updated/deleted from a variety of languages such as Perl, Ruby, and Go.

For example, a Perl script that posts a classifier's precision, recall, and F-measure looks something like this. Simple, right?

use JSON::Types;
use WebService::Mackerel;

my $mackerel = WebService::Mackerel->new(
    api_key => $ENV{ML_STUDY_MACKEREL_API_KEY},
    service_name => 'ML-Study',
);

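# $metrics is assumed here to be a hashref of evaluation scores
# (precision/recall/f_value) computed elsewhere by your own evaluation code.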
my $res = $mackerel->post_service_metrics(
    [
        {
            "name"  => "evaluation.precision",
            "time"  => time,
            "value" => JSON::Types::number $metrics->{precision},
        },
        {
            "name"  => "evaluation.recall",
            "time"  => time,
            "value" => JSON::Types::number $metrics->{recall},
        },
        {
            "name"  => "evaluation.f_value",
            "time"  => time,
            "value" => JSON::Types::number $metrics->{f_value},
        },
    ]
);

We haven't rolled this out inside the company yet, but I built a bot that recommends entries to chat about at our study group, and I have it post its accuracy. It looks like this.

f:id:syou6162:20161129204759p:plain

Because I never really pinned down annotation guidelines, Mackerel shows me the sad truth that accuracy is actually dropping even as I add more training data.

Summary

In this post I covered roughly three ways to have Mackerel post and monitor various metrics. I hope it's clear that a simple script of about ten lines (or even a one-liner) is enough, and that you can post metrics from Google Spreadsheet without running a server at all. I'd be delighted if you start using Mackerel from casual use cases like these and then go on to try what Mackerel is really built for: monitoring the state of your servers.

Tomorrow's entry is by htnosm.

References

blog.sushi.money blog.a-know.me astj.hatenablog.com


          [news] IT Spending Trends        
Tuesday, July 6, 2004
Dateline: China
 
A quick recap on IT spending trends from three recently published Smith Barney surveys.  The three reports are the May and June editions of their CIO Vendor Preference Survey and the 6 June issue of softwareWEEK.  Tom Berquist, my favorite i-banking analyst, was the lead for all three reports.  I have a backlog of blogs to write, so I'll use as many quotes as possible and add context where necessary.  (I'm mostly extracting from my smartphone bookmarks for these reports.  Warning:  I may have coded the May and June issues incorrectly, but the quotes are correct.)  NOTE:  Highlighted items (e.g., items in bold, like this sentence) are MY emphasis.  Items in red are my commentary.
 
Starting with the Survey editions, "(t)he strongest areas of spending appear to be software (apps, security, storage, and database) and network equipment/apps (Gigabit Ethernet, WLAN, VPNs)" and regarding software, "larger and more well known vendors continue to dominate the list in each category with vendors such as Microsoft, SAP, IBM, Veritas, Symantec and Computer Associates getting significantly more mentions in each of their groups than the remaining vendors did."  However, the report admits that their sample group might be biased.  Yes, vendors matter -- and so do vendor partnering strategies.  However, I'm a bit skeptical about CA and I don't particularly care much for Veritas or Symantec.  Not my part of the universe.
 
"Applications again stand out as a clear area of strength."  "Within applications, Enterprise Resource Planning (ERP), Supply Chain Management (SCM), Customer Relationship Management (CRM) and Business Intelligence (BI) all showed extremely well ..."  Well, this is the first sign that a recovery may be in the making for SCM.  However, I'd emphasize BI and ERP, followed by CRM; don't count on a lot happening in the SCM space just yet.  Some other key surveys do NOT validate that SCM is in recovery.  "In terms of specific vendors, Microsoft, Symantec, Veritas, SAP, and Adobe were the top beneficiaries of CIOs intentions to increase spending."  The report continues that only SAP showed statistically significant results, both in ERP and SCM.  "Results were more mixed for best-of-breed vendors in this area, suggesting that horizontal applications vendors are having a tough time competing with the large ERP vendors even as vertically-focused vendors continue to have some measure of success on this front."  For the more adventurous SIs in China, SAP presents a lot of opportunities.  Tread carefully, though.  And "Adobe's enterprise strategy appears to be gaining momentum.  Adobe was a clear standout in content management ..."  "Survey results were also positive (though somewhat less so) for other leading content management players, notably Microsoft and IBM."  Another "win" for Microsoft.  Funny that none of the traditionally leading content management players were mentioned.  A take on Linux:  "Linux continues to garner mind share, but large enterprises remain the main adopter.  Interestingly, nearly 83% of our respondents stated that they were not currently moving any applications to Linux.  Of the 17% that said they were moving applications to Linux, only one company under $1.0 billion in revenue was making the transition to Linux confirming our views that Linux is primarily being used by large companies to shift Unix applications to Linux on Intel."
 
"Among CIOs who indicated a higher level of consulting spend, IBM was the clear winner, followed by Accenture as a distant second.  Unisys was also mentioned as a vendor being considered, but it was a distant third.  However, we note that Unisys being mentioned ahead of a pure-play consultant like BearingPoint (a low number of mentions, which included mentions of decreased spending) or EDS is positive, given that Unisys chooses to focus in 2 specific verticals, including one-public sector-that was not in the survey."  "Over two-thirds of CIOs indicated that they do not use IT outsourcers.  Most of the rest said they were unlikely to change the level of outsourcing spend.  IBM, ACS and CSC were the only vendors explicitly mentioned as likely to get more outsourcing business."  The "two-thirds" figure will likely change in favor of outsourcing.  This trend is fairly clear.  See a BCG report at http://tinyurl.com/2muy8 , although the report takes a relatively broad perspective.
 
From softwareWEEK, "(t)he CIOs were also very focused on rapid 'time to market' with purchases.  None were interested in starting projects that would take greater than 2 quarters to complete."  "This requirement was not a 'payback' requirement, but rather an implementation time frame requirement.  The CIOs did recognize that payback times could be longer, though the payback times on IT utility spending were much shorter than on applications or emerging area spending."
 
"In terms of spending, the CIOs all used a similar methodology for making decisions that essentially divides their IT spending into one of three categories: 1) sustained spending on their 'IT utility' (i.e., infrastructure such as network equipment, servers, storage, databases, etc.); 2) new project spending on applications (business intelligence, portals, CRM, etc.); and 3) investment spending on select emerging areas (grid/utility computing, identity management, collaboration, etc.)  It was pretty obvious that the CIOs recognized that business unit managers were more interested in spending on new applications/emerging areas than on the IT utility ..."  "(S)ome of the CIOs were experimenting with grid/utility computing initiatives to try to increase their utilization of storage/servers and reduce the amount of new equipment to be purchased.  In one example, a CIO showed their storage/server utilization around the world and many regions were in the 50% or worse bucket for average utilization.  Their goal was to use grid computing architectures and storage area networks (along with faster communication links) to better share the pool of resources."  Yes, this is it!!  Take this to heart!!  If you think grid and utility computing are Star Trek stuff, think again.
 
"In terms of new projects, the CIOs mentioned they were spending on business intelligence, portal/self-service applications, CRM, and collaboration.  Collaboration was a heated discussion, with all CIOs commenting that this was a big problem for them and there was no clear solution on the market.  While it wasn't completely clear to the audience what the CIOs were looking for in a collaboration solution, the elements that were described included: more intelligent email, corporate instant messaging, web conferencing, integrated voice over IP with instant messaging (so that a conversation could quickly shift from typing to talking), and collaborative document editing (spreadsheets, presentations, publications, etc.).  Within the business intelligence arena, business activity monitoring was discussed as was building of enterprise data warehouses/data marts.  The portal/self-service applications being built or deployed were primarily for customer and employee self-service (remote access to email, applications, and files was a big deal for all of the companies).  On the CRM front, the discussion from one CIO was around their need to increase revenues and manage channel conflict better."  [I'll be posting to this blog a bit more about collaboration opportunities over the next week.]
 
"While vendors were not discussed in any detail during the panel, the CIOs did say that they remain open to working with smaller vendors (public and private) as long as they have plenty of relevant references (in their industry, particularly with close competitors) and they offer a compelling value proposition versus larger vendors.  One CIO stated that they get called by 20 startups a week to sell products to them, but most of them cannot articulate the value proposition of their product.  Nonetheless, the CIO does take 5 meetings a month from startups because some of them are working on things that are interesting to the business."
 
Whew ...  Lots of good materials.  To reiterate, all highlighted items are my emphasis.  Bottom line:  The market is heating up.  Get your ISV relationships in place.  Pick your verticals (see the "Tidbit on Microsoft" which follows).  Pick your apps -- and the apps I like the best are content management and BI, although ERP is looking good, too.  Collaboration can be a major source of revenue if the SI can provide a truly effective solution.
 
Tidbits on Microsoft
 
A quick update on some happenings in the Redmond universe.  (See http://tinyurl.com/36xgu ; the article is titled, "Microsoft focuses on its enterprise-applications business".)  First, app areas that are of particular interest to MS include those for manufacturing and life sciences.  So, how about a MS build-to-their-stack strategy focused on either of these two verticals?  Second, MS is moving beyond purely horizontal offerings to very specific functionality.  Their Encore acquisition is an example of MS moving in this direction.  Finally, new releases of all four of Microsoft's ERP product lines are due for this year.  Not surprisingly, MBS marketing is up 20% from FY04.  Hmmm ... ERP spending intentions are strong and MS is a key player in this space -- with several updated offerings scheduled for release this year.  Another opportunity?
 
Tidbits on Infosys
 
Infosys formally enters the IT strategy consulting biz.  (See http://tinyurl.com/2xxlo .)  Yes, it was inevitable.  In April Infosys Consulting, Inc. was formed and, "(i)t's no secret that the winning model will be high-end business consulting combined with high-quality, low-cost technology delivery done offshore," according to Stephen Pratt, the head of Infosys' consulting unit.  The Infosys Consulting unit now has 150 employees in the States and plans to expand to 500 within three years.  Note to SIs in China:  You need more -- a lot more -- IT strategy types.  And you need people in the States (at least on an "as needed" basis) in order to capture -- and serve -- new accounts.
 
Cheers,
 
David Scott Lewis
President & Principal Analyst
IT E-Strategies, Inc.
Menlo Park, CA & Qingdao, China
http://www.itestrategies.com (current blog postings optimized for MSIE6.x)
http://tinyurl.com/2r3pa (access to blog content archives in China)
http://tinyurl.com/2azkh (current blog postings for viewing in other browsers and for access to blog content archives in the US & ROW)
http://tinyurl.com/2hg2e (AvantGo channel)
 
To automatically subscribe click on http://tinyurl.com/388yf .
 

          Questions!        
I feel like my readers (you!) don't really have a good idea of who I am, which is mostly my fault. As I prepare for my major comeback (of sorts), I thought I would give you opportunities to get to know me! I plan on having a week where I put up information about me, my blog, and book reviews. Oh and also CONTESTS. :D BUT first, today, I want you to ask me all the questions you have for me. You can use the form below, e-mail me, or comment. I'll answer all of them in a post on "The Week," as long as they are appropriate. I don't know when the actual week will be, but I'm leaning towards Aug. 23-30. I think that will be the soonest I will be able to.

Oh, and asking me questions MAY get you bonus entries on a future giveaway. *hint* *hint*

Ask away!


          To Kill a Mockingbird 50th Anniversary/Giveaway        
"Shoot all the bluejays you want, if you can hit 'em, but remember it's a sin to kill a mockingbird."

A lawyer's advice to his children as he defends the real mockingbird of Harper Lee's classic novel—a black man charged with the rape of a white girl. Through the young eyes of Scout and Jem Finch, Harper Lee explores with rich humor and unswerving honesty the irrationality of adult attitudes toward race and class in the Deep South of the 1930s. The conscience of a town steeped in prejudice, violence, and hypocrisy is pricked by the stamina and quiet heroism of one man's struggle for justice—but the weight of history will only tolerate so much.

The 50th anniversary edition of one of the best-loved books in American history: Harper Lee’s Pulitzer Prize-winning classic To Kill a Mockingbird. Featuring some of the most memorable characters in literary history—attorney Atticus Finch, his children Scout and Jem, and of course Boo Radley—To Kill a Mockingbird is the indelible story of race, class, and growing up in the Deep South of the 1930s.

To commemorate the Golden Anniversary of the “Best Novel of the 20th Century” (Library Journal poll of American librarians), filmmaker Mary Murphy has interviewed prominent figures—including Oprah, Anna Quindlen, and Tom Brokaw—on how the book has impacted their lives, and compiled the interviews in Scout, Atticus, and Boo: the perfect companion to one of the most important American books of the 20th Century. Additionally, Scout, Atticus, and Boo features a foreword from acclaimed writer Wally Lamb.



July 11th marks the 50th anniversary of To Kill a Mockingbird by Harper Lee. To celebrate the occasion, I was given the opportunity to give away 2 sets of To Kill a Mockingbird and a book companion, Scout, Atticus, and Boo: A Celebration of 50 Years of To Kill A Mockingbird. Also, many bookstores across the country are throwing parties for the book. You can find dates and participating stores here: http://tokillamockingbird50year.com/.

To enter this giveaway, complete the form below! USA only, and the deadline is July 17 at 12 p.m. :)


          Trips Giveaway        
Since this is my last summer before I start college (at Georgia Tech!), I wanted to make sure it was amazing. Over the weekend I went to Mississippi. I had a great time, but you know, my favorite thing about going on trips is coming back! Nothing is better than returning to my bed!

Speaking of summer, I want to know how your summer is going! Not to mention, I have a great opportunity for all of my readers! Meaning, giveaway time. Just fill out the form (link below) for a chance to win a $40 gift card, which you can use at any CSN Store! There may be shipping charges or, in the case of Canadian addresses, international fees for certain products. Giveaway ends July 4th.

FORM
          23 Things        
NOTE: THIS PROGRAM IS ARCHIVED FOR REFERENCE ONLY. SBISD has NO plans to have a 2011 Summer session.


Listed below are 23 Things (or activities) that you can do on the web to explore and expand your knowledge of the Internet and Web 2.0 tools.

Summer 2010 Session
Start Date: Monday, June 7, 2010
Blog posts must be dated on or after this date.

Completion Deadline: Monday, August 9, 2010
All 23 Things must be completed by 11:59 pm. No partial credit will be given.











Spring Branch employees only -- If finished by August 9, 2010 Spring Branch employees will receive 18 hours off-contract PDLC credit. The PDLC course # is 1671.13464












Out of district players -- If finished by August 9, 2010 players from outside SBISD will receive a certificate indicating 18 hours of professional development participation.

For more information, contact vaughn.branom@springbranchisd.com





Face-to-Face (F2F) Recess
These F2F encounters will take place at the Media Center on the SBEC Campus.
Come-&-go informal assistance will be available from 9:00-3:00. Bring your laptop!
Tuesday, June 22nd
Wednesday, July 7th
Thursday, July 15th
Tuesday, July 20th


Ready! Set! Play! Begin here.
(Be sure and click on the hot links. They tell you what to do!)


23 Things*
Week 1: Life Long Learning



Week 2: Blogging
(Remember the weeks are just guidelines and markers for PDLC. You may use your own timeline as long as you finish on time! You do NOT have to have things done by any date EXCEPT the final date -- August 9th)





  • Thing #3: Set up your own blog, create an avatar, and add a post about what you did.


  • Thing #4: Register your blog and begin your Library2Play journey. (To be officially registered you must have posts written for Thing 2 and Thing 3.)


Week 3: Photos & Images





Week 4: RSS & Newsreaders





Week 5: Hodge-Podge









Week 6: Tagging, Folksonomies & Technorati





Week 7: Wikis & Rollyo





Week 8: Online Applications & Tools





Week 9: Videos, Podcasts, & Nings





More Help - FAQs



* Note: This project is loosely based upon the website 43Things (which allows you to set and track personal goals) and the Stephen Abram article titled 43 Things I (or You) might want to do this year (Information Outlook - Feb 2006). Our adaptation comes largely from School Library Learning 2.0 An online learning program for CSLA members and friends and Learning 2.0 Through Play by Mary Woodard Library Director at Mesquite ISD, Texas.


          Thing #7: Cool Google Tools        

Google is the most famous search engine on the web these days, with the very name becoming a verb in our language. Here's Wikipedia's entry on this phenomenon:


The verb to google (also spelled to Google) refers to using the Google search engine to obtain information on the Web. For example, "Mary googled for recipes." A neologism arising from the popularity and dominance[1] of the eponymous search engine, the American Dialect Society chose it as the "most useful word of 2002." [2] It was officially added to the Oxford English Dictionary on June 15, 2006,[3] and to the 11th edition of the Merriam-Webster Collegiate Dictionary in July 2006.[4] The first recorded usage of google used as a verb was on July 8, 1998, by Larry Page himself, who wrote on a mailing list: "Have fun and keep googling!"[5]


Although we generally equate Google with web searching, that's not what this "thing" is about. Google also has a variety of free web tools that can be particularly useful in education. Some of these are:


Google Alerts - will e-mail the news to you as it happens. Just enter a search term (educational term, news topic, person, event, sports team, etc.) that you would like to keep tabs on. Whenever that topic appears in a news item or on the web, Google Alerts will send you an e-mail.


Google Calendar - lets you organize your schedule and share it with family, friends, teachers, students...


iGoogle - gives you a customizable home page where you can add links, news feeds, gadgets, etc. (Be sure and look at the gadgets - these are really fun!). Students can use iGoogle as their home page. They can have tabs for separate subjects or projects. They can set up gadgets to deliver information on topics, etc...


Picasa Web Albums - similar to Flickr; Google's version of photo sharing.


Google Scholar - Google Scholar provides a simple way to broadly search for scholarly literature. From one place, you can search across many disciplines and sources: peer-reviewed papers, theses, books, abstracts and articles, from academic publishers, professional societies, pre-print repositories, universities and other scholarly organizations.


Google Advanced Search – Allows you to search by file format. In other words, if you want a PowerPoint already created about a particular topic, you choose the PowerPoint (.ppt) and search for your topic.
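For example, typing a query such as solar system filetype:ppt (a made-up topic, using the equivalent filetype: operator in the search box) asks Google to return only PowerPoint files on that topic.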


Google Earth - Google Earth combines the power of Google Search with satellite imagery, maps, terrain and 3D buildings to put the world's geographic information at your fingertips. (SBISD employees, Google Earth is loaded on the Teacher roll-out laptops and maybe other computers, double check before you download).

and Google Docs - Look at all the new things you can do with plain ol' text, spreadsheet and presentation items!

And more! From the Google Search page, click on the "more" pull-down at the top middle of the screen. Then click on "even more" at the bottom of the pull-down list.

Discovery Resources:

Discovery Exercise


After looking at each of the Google tools, choose two of them to explore further. Try setting up an alert, calendar, notebook, or iGoogle page and using it. If sharing is an option for the tools you choose, make them public.


Blog about your experience with both tools and include a link (if you make it public) to your creation. Be sure and include possible educational uses.


          GenScriber        

transcription editor for census records, church records, birth, marriage, baptisms, burials, index records etc

GenScriber is a transcription editor for census records, church records, birth, marriage, baptisms, burials, index records etc.
Please note: GenScriber does NOT convert images into text. It is NOT OCR software.

GenScriber is designed to be intuitive and easy to use. The interface is composed of several resizable windows within a single main window. A register image can be viewed in the top window while data is input in the bottom window.

The data input area uses a spreadsheet style grid, but GenScriber is not a spreadsheet.

GenScriber is a stable, non-volatile data input application, designed for a specific purpose.
The problems associated with using spreadsheets for genealogical data input do not apply here. All cell inputs are alphanumeric. No assumptions are made about the data type. Dates and values are not automatically modified to some alien value you didn't want. Unless you specify a special action on a column, all data input remains exactly as you entered it.

GenScriber is free for private and non-commercial use.

It requires no installation. Versions for Linux and Windows are currently available.
          Democoding, tools coding and coding scattering        
Not much posting here for a while... So I'm going to just recap some of the coding work I have done so far... you will notice that it's going in lots of directions, depending on opportunities and ideas, sometimes not related to democoding at all... not really ideal when you want to release something! ;)

So, here are some directions I have been working so far...


C# and XNA

I tried to work more with C# and XNA... looking for an opportunity to code a demo in C#... I even started a post about it a few months ago, but left it in a draft state. XNA is really great, but I had some bad experiences with it... I was able to use it without requiring a full install, but while playing with model loading I hit a weird issue known as the black model bug. Anyway, I might come back to C# for DirectX stuff... SlimDx, for example, is really helpful for that.

A 4k/64k softsynth

I have coded a synth dedicated to 4k/64k coding, although right now I only have the VST and GUI fully working under Renoise... but not yet the asm 4k player! ;)



The main idea was to build an FM8/DX7-like synth with exactly the same output quality (excluding some fancy stuff like the arpeggiator...). The synth was developed in C# using vstnet, but it should be considered more of a prototype in this language... because the asm code generated by the JIT is not really good when it comes to floating point calculation... anyway, it was really good to develop on this platform, being able to prototype the whole thing in a few days (and of course, many more days to add rich GUI interaction!).

I still have to add a sound library file manager and the DX7 patch importer..... Yes, you read that right... my main concern is to provide as many ready-to-use patches as possible for ulrick (our musician at FRequency)... Decoding the DX7 patch format is well documented around the net... but the more complex part was to make it decode the way FM8 does... and that was tricky... Right now, all the transform functions are in an Excel spreadsheet, but I have to code them in C# now!

You may wonder why I developed the synth in C# if the main target is to code the player in x86 asm. Well, for practical reasons: I needed to quickly experiment with the versatility of this synth's sounds, and I'm much more familiar with .NET WinForms for building a complex GUI easily. Still, I designed the whole synth with the 4k limitation in mind... especially regarding data representation and the complexity of the player routine.

For example, in the 4k mode of this synth, waveforms are strictly restricted to only one: sin! No noise, no sawtooth, no square... what? A synth without those waveforms?.... but yeah.... When I looked back at the DX7 synth implementation, I realized that it uses only a pure "sin"... but with the complex FM routing mechanism plus the feedback on the operators, the DX7 is able to produce a large variety of sounds ranging from strings, bells and bass... to drumkits, and so on...

I also did a couple of effects, mainly a versatile variable delay line to implement chorus/flanger/reverb.

So basically, I should end up with a synth with two modes:
- 4k mode : only 6 oscillators per instrument, only sin oscillators, simple ADSR envelope, full FM8-like routing for operators, fixed key scaling/velocity scaling/envelope scaling. Effects per instrument/global with a minimal delay line + optional filters. And last but not least, polyphony: that's probably the thing I miss the most in 4k synths nowadays...
- 64k mode : up to 8 oscillators per instrument, all FM8 oscillators+filters+WaveShaping+RingModulation operators, 64-step FM8-like envelopes, dynamic key scaling/velocity scaling/envelope scaling. More effects, with better quality, 2 effect parallel+serial lines per instrument. Additional effects channels to route instruments to the same effects chain. Modulation matrix.

The 4k mode is in fact a restriction of the 64k mode, applied mostly at the GUI level. I'm currently targeting only the 4k mode, while designing the synth so that it's ready to support the 64k mode features.

What's next? Well, finish the C# part (file manager and DX7 import) and start the x86 asm player... I just hope to stay under 700 compressed bytes for the 4k player (while the 64k mode will be written in C++, with an easier limit of around 5KB of compressed code)... but hey, until it's coded... it's pure speculation!.... And as you can see, the journey is far from finished! ;)

Context modeling Compression update

During this summer, I came back to the compression experiment I did last year... The current status is quite up in the air... The compressor is quite good, sometimes better than crinkler for 4k... but the prototype of the decompressor (not working, not tested....) is taking more than 100 bytes more than crinkler's... So in the end, I know that I would be off by 30 to 100 bytes compared to crinkler... and this is not motivating me to finish the decompressor and get it really running.

The basic idea was to take the standard context modeling approach from Matt Mahoney (also known as PAQ compression; Matt did a fantastic job with his research and open source compressors, by the way), using a dynamic neural network with an order of 8 (8-byte context history), with the same mask selection approach as crinkler + some new context filtering at the bit level... In the end, the decompressor uses the FPU to decode the whole thing... as it needs ln2() and pow2() functions... So during the summer, I thought about using another logistic activation function to get rid of the FPU: the standard sigmoid used in the neural network with a base of 2 is 1/(1+2^-x), so I found something similar with y = (x / (1 + |x|) + 1) / 2 from David Elliot (some references here). I didn't have any computer at the time to test it, so I spent a few days doing some math optimization on it and working out the logit function (the inverse of this logistic function).
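For reference, a quick sketch of that math (my own working, so double-check it): the Elliot function y = (x / (1 + |x|) + 1) / 2 inverts to x = (2y - 1) / (1 - |2y - 1|), which only needs an add, an abs and a divide, whereas the base-2 sigmoid y = 1/(1+2^-x) inverts to x = log2(y / (1 - y)), which is what drags in the FPU's log/pow instructions.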

I came back home very excited to test this method... but I was really disappointed... the function hurt the compression ratio by around 20%; in the end, completely useless!

If by next year I'm not able to release anything from this.... I will make all this work open source, at least for educational purposes... someone will certainly be cleverer than me at this and tweak the code size down!

A SlimDx-like DirectX wrapper in C++

Recall that for the ergon intro, I was working with a very thin layer around DirectX to wrap enums/interfaces/structures/functions. I did that for D3D10, a bit of D3D11, and a bit of D3D9 (which was the one I used for ergon). The goal was to achieve a C#-like DirectX interface in C++. While the code was written almost entirely by hand, I was wondering if I could generate it directly from the DirectX header files...

So for the last few days, I have been working on this a bit... I'm using boost::wave as the preprocessor library... and I have to admit that the C++ guys from boost lost their minds with templates... It's amazing how they made something simple so complex with templates... I wanted to use this from a C++/CLI managed .NET extension to ease my development in C#, but I ended up with a template error at the link stage... an incredible error with a line full of concatenated templates, which even froze Visual Studio when I wanted to see the errors in the error list!

Templates are really nice when they are not used too intensively... but when everything in your code is templatized, it becomes very hard to use a library fluently, and it's sometimes impossible to understand a template error when that error is more than 100 lines full of cascading template types!

Anyway, I was able to plug boost::wave into a native DLL and call it from a C# library... the next step is to see how much I can get from the DirectX header files to extract a form of IDL (Interface Definition Language). If I can't get something relevant within the next week, I might postpone this task until I have nothing more important to do! The good thing is that, for the D3D11 headers for example, you can see that those files were auto-generated from a mysterious... d3d11.idl file... used internally at Microsoft (although it would have been easier to get that file directly!)... so it means the whole header is quite easy to parse, as the syntax is quite systematic.

OK, this is probably not linked to intros... or probably only useful for 64k.... and I'm not sure I will be able to finish it (much like rmasm)... And this kind of work is keeping me away from working directly with DirectX, experimenting with rendering techniques and so on... Well, I also have to admit that for the past few years I have been more attracted to building tools that enhance coding productivity (not necessarily only mine)... I don't like doing too many things manually.... so every time there is an opportunity to automate a process, I can't keep myself from making it automatic! :D


AsmHighlighter and NShader next update

Following my weakness for tools, I need to make some updates to AsmHighlighter and NShader: add some missing keywords, patch a bug, support the new VS2010 version... whatever... When you release this kind of open source project, well, you have to maintain it, even if you don't use it that much yourself... because other people are using it and asking for improvements... that's the other side of the picture...

So because I have to maintain those two projects, and they in fact logically share more than 95% of the same code, I have decided to merge them into a single one... which will be available soon under codeplex as well. That will be easier to maintain, ending up with only one project to update.

The main features people are asking for are the ability to add keywords easily and to map file extensions to the syntax highlighting system... So I'm going to generalize the design of the two projects to make them more configurable... hopefully this will cover the main feature requests...

An application for Windows Phone 7... meh?

Yep... I have to admit that I'm really excited about the upcoming Windows Phone 7 Metro interface... I'm quite fed up with my iPhone's look and feel... and because the development environment is so easy with C#, I have decided to code an application for it. I'm starting with a chromatic tuner for guitar/piano/violin... etc., and it's working quite well, even if I have only been able to test it under the emulator. While developing this application, I have learned some cool things about pitch detection algorithms and so on...

I hope to finish the application around September, and to be able to test it on real hardware when WP7 is officially launched... and before putting this application on the Windows Marketplace.

If this works well, I'll look at developing other applications, like porting the softsynth I did in C# to this platform... We will see... and definitely, this last part is completely unrelated to democoding!


What's next?

Well, I have to prioritize my work for the next months:
  1. Merge AsmHighlighter and NShader into a single project.
  2. Play a bit for one week with DirectX headers to see if I could extract some IDL's like information
  3. Finish the 4k mode of the softsynth... and develop the x86 asm player
  4. Finish the WP7 application
I still have an article to write about the making of ergon as well; not much to say about it, but it could be interesting to write those things down on paper....

I also need to work on some new DirectX effects... I have played a bit with hardware instancing and compute shaders (with raymarching and global illumination for a 4k procedural compo that didn't make it to BP2010, because the results were not impressive enough and too slow to compute...)... I would really like to explore SSAO with plain polygons some more... but I didn't take the time for that... so yep, practicing more graphics coding should be at the top of my list... instead of all those time-consuming and - sometimes useful - tools!
          Taper        

Shorter runs, more sleep and all the food. 

After 15 weeks of training, I've hit the long-awaited and well-earned taper as the starting line of the Indianapolis Monumental Marathon inches closer.

It's that spot in training, too, like pregnancy when you feel like you have spent forever training but don't feel ready. The end is now far too close and there is so much work that could have been done.

I only hit 40 miles in a week maybe once. I ran a 5-mile tempo during half marathon training this spring; I wonder whether I should have done 6 or 7 miles for the marathon; I don't remember too many mid-week long runs that exceeded 8 miles.

And yet, as I fret, I know that I followed the plan. There are ridiculous smiley face stickers covering each of the runs – the negative split efforts that required patience and control, the intervals that required a disconnect from my brain and quick turnover; the tempos and race pace efforts; the long runs and then long runs with fast miles. I have retired shoes and stink-ridden tanks.

The spreadsheet in my Google Drive reports that I have run 464.7 miles since I began training on July 4, my first run of this training cycle a 3-miler on an Atlanta hotel treadmill.

My posting about this training cycle has been sporadic at best. I had hopes to deliver recaps every week but barely managed bi-weekly, if that. Usually that would be an indication that my training was suffering somewhere, but it hasn't been. It's the other things – life, family, work – that have made it hard to share things, but running, thankfully, has been a constant in these 15 weeks.

Rather than try to regurgitate the training runs I've missed and bore you with weeks of logs, I thought I'd share some of the recent workouts I've done. A reminder to myself that though part of me wants another four weeks of high mileage, I will hit the starting line strong.


Negative splits: The plan called for 7-9 miles, blocking the run in three sections – 3, 4 miles; 3, 4 miles; 1 mile. The run came at the tail end of a busy week, and I had to run both my long run and this one over the weekend. Thankfully, I was able to procure company for the Saturday outing. I slowly ran a mile to our meeting place, trying to come in around long run pace. I arrived a bit early and so I continued the slow jog to and from the baseball diamonds. My first two miles were 10:02 and 10:03. But when my friends joined me, the pace quickened. Excited to all see each other, we began talking – always translating to quicker turnover. 9:32, 9:27, 9:16 the splits beeped from my Garmin 230. When the 9:16 came in, I realized that this run would no longer get checked as a long, slow effort but the negative split workout of the week. We ran two 9:15s and then I split to head home. Ready to make that last mile count, to check the last block, I picked it up. 8:29. Not too bad for a week after my 26.2-mile long run at Fort4Fitness. 

Tempo: If there is a run that intimidates me most, it is the tempo. The long, hard sustained effort is, well, just hard. The last big one for Monumental included a warm-up, one mile at race pace (9:09), 4-5 miles at tempo and a cool down. I really wanted to do this run on the treadmill so I could let the machine set the pace and I just had to hold on, but a gorgeous fall day couldn't be spent running indoors. My warm-up mile was a little fast (9:29), as was my race pace (8:48). After that, most of the run was a mental shit storm of self doubt. I had to stop once to tie my shoe and another time just to get my crap together. Still, I was happy to see that the 8 miles were at an 8:42 pace.

The 20-miler: After running a successful 26.2 miles on Oct. 1, I was pretty confident going into my final 20-miler. However, I really had to work for it around mile 13, and miles 17-20, which were on my own, were a battle between my will to get it done and my lack of interest in running by myself. I'm pretty sure I just sat down at one point to tie my shoes. It was too much to bend over, I guess. Or, maybe I was hoping that I'd flop over and fall asleep.


          Software Engineering Course Material        

Definition of Software
Software is a collection of computer instructions executed by a computer in carrying out its job, namely processing information.
However, software is not the same as a computer program. Software covers not only the program itself but also all of the related documentation and configuration data needed to make the program operate correctly.

Definition of Software Engineering
Software engineering is a discipline that covers all aspects of software production, from the early stage of analyzing user needs and specifying user requirements, through design, coding and testing, to maintaining the system after it has gone into use.

Types of Software
Viewed from the standpoint of function, software can be grouped into:
1. System software
Software whose use is aimed mostly at operating the computer itself.
• operating systems
• programming language translators (compilers/interpreters)
2. Application software
Software whose use is aimed mostly at helping users solve the problems they face.
• ready-made packaged programs
• self-built application programs

Viewed by application area, software can be divided into:
1. System Software
A collection of programs written to serve other programs, e.g. editors, drivers and so on.
2. Real-Time Software
Software used to measure/analyze or control the intake of data from the outside environment through to producing the desired reports.
3. Business Software
Software that supports business operations or management decision-making, e.g. accounting, inventory and payroll systems, and so on.
4. Engineering and Scientific Software
Software used in technical and engineering application areas.
This kind of software usually involves numerical computation, CAD (Computer Aided Design), system simulation, and so on.
5. Embedded Software
Software used to control a product or a system in which the software itself resides. It is usually placed in ROM, e.g. the buttons on a microwave oven.
6. Personal Computer Software
Widely used for personal applications, for example: word processors, spreadsheets, games, DBMSs and so on.
7. Artificial Intelligence Software
Built using non-numeric algorithmic techniques to solve complex problems; used in artificial intelligence applications, for example: games, expert systems, neural networks, Turbo Prolog, and so on.

Software Process Models
1. Workflow model → shows the activities of the process together with their inputs, outputs and dependencies. It represents the work of the people involved.
2. Data flow model → represents the process as a set of activities that transform data. It shows how input to the process, e.g. a specification, is transformed into output, e.g. a design.
3. Role/action model → represents the roles of the people involved in the software and the activities for which they are responsible.

Generic Process Models
1. Waterfall model → takes the basic activities such as specification, development, validation and evolution and represents them as separate process phases such as requirements specification, software design, implementation, testing and so on.
2. Evolutionary development → this approach interleaves the activities of specification, development and validation. An initial system is developed quickly from an abstract specification and is then refined with input from the customer to produce a system that satisfies the customer's needs.
3. Formal systems development → this approach produces a formal mathematical system specification and transforms that specification, using mathematical methods, into a program.
4. Reuse-based development → this technique assumes that parts of the system already exist. The development process focuses on integrating those parts rather than developing them from scratch.

Attributes of Good Software
Software should provide the user with the required functionality and performance, and should be:
1. Maintainable (maintainability) → the software must be able to accommodate changing user needs.
2. Dependable (dependability) → the software must be trustworthy and must not cause physical or economic damage in the event of system failure.
3. Efficient → the software must make efficient use of system resources.
4. Usable (usability) → the software must be usable as intended.

The Software Process
A set of activities and associated results that produce software, largely carried out by software engineers. There are 4 activities in the software process:
1. Software specification → the functionality of the software and the constraints on its operation must be defined.
2. Software development → software that meets the specification must be produced.
3. Software validation → the software must be validated to ensure that it does what the customer wants.
4. Software evolution → the software must evolve to meet changing customer needs.
  
Software Engineering Methods
Structured approaches to software development, covering system models, notations, rules, development advice (recommendations) and process guidance.
1. System model descriptions → descriptions of the models that must be developed and the notation used to define them, e.g. a data flow model.
2. Rules → constraints that apply to the system models, e.g. every entity in a system model must have a unique name.
3. Recommendations → advice on what constitutes good design, e.g. no object should have more than seven sub-objects associated with it.
4. Process guidance → activities that can be followed to develop the system models, e.g. object attributes must be documented before the operations associated with the object are defined.

Source:
Course lecture materials

          Assoc. Dir. @ City of Somerville Libraries : City of Somerville Libraries Associate...        
Author: al.milo
Subject: 1036
Posted: 10/Sep/2013 at 5:15pm

City of Somerville
Libraries

Associate Director of Libraries

Apply date: 9/5/13 - 10/5/13



WE SEEK A HIGHLY CREATIVE, TECH SAVVY, ENERGETIC, POSITIVE INDIVIDUAL WHO WANTS TO JOIN A TEAM COMMITTED TO PROVIDING EXCELLENT LIBRARY SERVICES TO THE CITY OF SOMERVILLE.



Under the general direction of the Director, the Associate Director acts as a human resources and organizational development officer for the library system. Additionally, this individual is responsible for developing system-wide fundraising, marketing and communications, and program strategies, directing and working with other library staff to coordinate consistent and effective library services system-wide that are in line with the library’s strategic goals, vision, and mission. As the most senior library staff person next to the Director of Libraries, this position assumes responsibility for all library functions including management, direction, budget, and operations in the absence of the Director. The Associate Director is required to perform all summarily related duties; such as but not limited to:

Budget preparation and maintenance.
Fundraising management.
Training and support to new and existing staff.
Oversees the day-to-day financial reporting, accounts payable and receivables.
Chief procurement officer who researches large purchases and coordinates contract information.
Oversees payroll and attendance for accuracy and completion.
Attends quarterly medical panels and processes medical payment.
Participates in interviewing and hiring of office staff.
Attends Senior Command staff meetings.


Education and Experience:

Masters in Library Science (MLS) and three (3) years of supervisory/administrative experience; or MLS and any equivalent combination of education, training and experience which provides the required knowledge, skills and abilities to perform the essential functions of the job.



Knowledge of principles, practices, materials and current trends and theories in library science, as well as the latest tools and technology.

Must possess data processing skills in the use of personal computers and office software including word processing, database and spreadsheet applications in support of department operations.

Ability to exercise considerable judgment in dealing effectively with diverse constituencies in a responsive manner.

Ability to resolve conflict situations in a calm and constructive manner.

Ability to set priorities and make effective use of time management.

Ability to develop effective working relationships with department personnel, subordinates, Trustees, City officials and the public.

Ability to express oneself clearly and concisely, both orally and in written form.

Ability to forge partnerships and professional relationships.

Ability to manage change.

Must display a high degree of initiative for planning and implementing all programs of service.

Must have a high degree of management skills, analytical abilities, communication skills, as well as leadership abilities, including building shared vision and motivating others to perform to the best of their abilities.



Salary: $65,000 per year; paid weekly at $1,250; plus benefits package



Send resume along with a cover letter to:
City Hall Personnel Office

93 Highland Avenue

Somerville MA 02143

Fax: 617-666-4426

TTY: 1-866-808-4851

Email: employment_opportunities@somervillema.gov












          10 Secrets of becoming a successful entrepreneur.        

In one of my posts, "Top 10 must watch videos for entrepreneurs," I showed videos that make you feel the same spark those people feel. Now I'm telling you 10 secrets of becoming a successful entrepreneur. After a lot of research, expert talks and speeches, I summed up these ideas as follows---

1. You must be passionate about what you are trying to achieve

That means you’re willing to sacrifice a large part of your waking hours to the idea you’ve come up with. Passion will ignite the same intensity in others who join you as you build a team to succeed in this endeavor. And with passion, both your team and your customers are more likely to truly believe in what you are trying to do.

2. Great entrepreneurs focus intensely on an opportunity where others see nothing.

This focus and intensity help eliminate wasted effort and distractions. Most companies die from indigestion rather than starvation, i.e., companies suffer from doing too many things at the same time rather than doing too few things very well. Stay focused on the mission.

3. Success comes only from hard work.

We all know that there is no such thing as overnight success. Behind every overnight success lie years of hard work and sweat. People with luck will tell you there’s no easy way to achieve success--and that luck comes to those who work hard. Successful entrepreneurs always give 100% of their efforts to everything they do. If you know you are giving your best effort, you’ll never have any reason for regrets. Focus on things you can control; stay focused on your efforts, and let the results be what they will be.

4. The road to success is going to be long, so remember to enjoy the journey.

Everyone will teach you to focus on goals, but successful people focus on the journey and celebrate the milestones along the way. Is it worth spending a large part of your life trying to reach the destination if you didn’t enjoy the journey? Won’t the team you attract to join you on your mission also enjoy the journey more? Wouldn’t it be better for all of you to have the time of your life during the journey, even if the destination is never reached?

5. Trust your gut instinct more than any spreadsheet.

There are too many variables in the real world that you simply can’t put into a spreadsheet. Spreadsheets spit out results from your inexact assumptions and give you a false sense of security. In most cases, your heart and gut are still your best guide. The human brain works as a binary computer and can analyze only the exact information-based zeros and ones (or black and white). Our heart is more like a chemical computer that uses fuzzy logic to analyze information that can’t be easily defined in zeros and ones. We’ve all had experiences in business where our heart told us something was wrong while our brain was still trying to use logic to figure it all out. Sometimes a faint voice based on instinct resonates far more strongly than overpowering logic.

6. Be flexible but persistent - every entrepreneur has to be agile to perform.

You have to continuously learn and adapt as new information becomes available. At the same time, you have to remain persistent to the cause and mission of your enterprise. That’s where that faint voice becomes so important, especially when it is giving you early warning signals that things are going off track. Successful entrepreneurs find the balance between listening to that voice and staying persistent in driving for success--because sometimes success is waiting right across from the transitional bump that’s disguised as failure.

7. Rely on your team. It's a simple fact: no individual can be good at everything.

Everyone needs people who have complementary sets of skills. Entrepreneurs are an optimistic bunch, and it’s very hard for them to believe that they are not good at certain things. It takes a lot of soul searching to find your own core skills and strengths. After that, find the smartest people you can who complement your strengths. It’s easy to get attracted to people who are like you; the trick is to find people who are not like you but who are good at what they do--and what you can’t do.

8. Execution, execution, execution

Unless you are the smartest person on earth (and who is), it’s likely that many others have thought about doing the same thing you’re trying to do. Success doesn’t necessarily come from breakthrough innovation but from flawless execution. A great strategy alone won’t win a game or a battle; the win comes from basic blocking and tackling. All of us have seen entrepreneurs who waste too much time writing business plans and preparing PowerPoints. I believe that a business plan is too long if it’s more than one page. Besides, things never turn out exactly the way you envisioned them. No matter how much time you spend perfecting the plan, you still have to adapt according to the ground realities. You’re going to learn a lot more useful information from taking action rather than hypothesizing. Remember: Stay flexible, and adapt as new information becomes available.

9. I can't imagine anyone achieving long-term success without honesty and integrity.

These two qualities need to be at the core of everything we do. Everybody has a conscience, but too many people stop listening to it. There is always that faint voice that warns you when you are not being completely honest or even slightly off track from the path of integrity. Be sure to listen to that voice.

10. Success is a long journey and much more rewarding if you give back.

By the time you get to success, lots of people will have helped you along the way. You’ll learn, as I have, that you rarely get a chance to help the people who helped you, because in most cases, you don’t even know who they were. The only way to pay back the debts we owe is to help people we can help--and hope they will go on to help more people. When we are successful, we draw so much from the community and society that we live in that we should think in terms of how we can help others in return. Sometimes it’s just a matter of being kind to people. Other times, offering a sympathetic ear or a kind word is all that’s needed. It’s our responsibility to do “good” with the resources we have available. 

          Insurance Company Creates Home Disaster        
Insurance Company Creates Home Disaster When Your Insurance Supplier Is In Charge Of Performing Temporary Repairs After A Disaster, You Would NOT EXPECT To Run The Risk Of Losing The Family...

Create your DIY Home Inventory quickly, with free lists & spreadsheets www.aussiehomeinventories.com.au

          Natural Disasters - Statistics & Facts        
Natural Disasters - Statistics & Facts We are normally focused on informing you, of when disaster is about to arrive and what you should do to reduce the effects. We came across this information,...

Create your DIY Home Inventory quickly, with free lists & spreadsheets www.aussiehomeinventories.com.au

          Disaster Emergency Preparedness Widgets        
Disaster Emergency Preparedness Widgets Looking For Quick & Easy Access To Disaster Videos For Your Site? One of the quickest / easiest ways to get video links on your site, pointing to Disaster...

Create your DIY Home Inventory quickly, with free lists & spreadsheets www.aussiehomeinventories.com.au

          Flood Preparation - Home Preparedness Tips        
Flood Preparation - Home Preparedness Tips Preparing your family and home in readiness for flood, is a 4 PART SYSTEM of preparation: Before Flooding.During Flooding.Evacuation If Required.After The...

Create your DIY Home Inventory quickly, with free lists & spreadsheets www.aussiehomeinventories.com.au

          Home Evacuation Box - Family Disaster Planning        
Home Evacuation Box - Family Disaster Planning What Is A Home Evacuation Box & What Should Be Placed In It? The Home Evacuation Box is a centralised location for all your important personal /...

Create your DIY Home Inventory quickly, with free lists & spreadsheets www.aussiehomeinventories.com.au

          Home Evacuation Kit - Family Emergency Preparedness        
Home Evacuation Kit - Family Emergency Preparedness A Home Evacuation Kit is an important part of your family's Disaster Preparedness. When creating a DIY Evacuation Kit you will need to include the...

Create your DIY Home Inventory quickly, with free lists & spreadsheets www.aussiehomeinventories.com.au

          Basic Emergency Kit - Home Disaster Preparation        
Basic Emergency Kit - Home Disaster Preparation If a Major Emergency or event occurs, it is important to have a Basic Home Emergency Kit 'on hand'. The following kit is designed to supply the basic...

Create your DIY Home Inventory quickly, with free lists & spreadsheets www.aussiehomeinventories.com.au

          Personal Emergency Plan - Home Disaster Preparation        
Personal Emergency Plan - Home Disaster Preparation With the increase and severity of Global Natural Disasters, it is important to make sure that everyone involved in your home, is aware of a...

Create your DIY Home Inventory quickly, with free lists & spreadsheets www.aussiehomeinventories.com.au

          Natural Disaster Preparedness - Awareness Campaign        
Are You Looking For Natural Disaster Information In Real-Time? Global Natural Disasters are increasing in both frequency and severity. We are told constantly, that it makes sense to PREPARE FOR LOSS....

Create your DIY Home Inventory quickly, with free lists & spreadsheets www.aussiehomeinventories.com.au

          Give Your B2B Marketing Materials a SMOG Test and Find Out if They Read like Newsweek or The IRS Tax Code (Part Two)        

Overview of Findings from Part One

In Part One, we looked at the benefits page of enterprise software company IQNavigator of Denver. IQNavigator technology helps large organizations manage all of the services they outsource to other companies.

Since it's a benefits page, I was stunned to find its Flesch Reading Ease index at zero in Microsoft Word. So, Part One was dedicated to giving IQNavigator a completely fair review. This meant we had to accomplish two things:

  • Ensure Word was accurately assessing the passage
  • Prove the Flesch Reading Ease metric itself was a reasonable measure

We found Word's calculation of Flesch Reading Ease to be reliable because:

  • My spreadsheet calculation, while slightly negative, basically agrees with Word's zero finding
  • The freely downloadable Java Application called Flesh also scored the passage at zero

The Flesch Reading Ease metric proved to be trustworthy. To come to this conclusion, we analyzed the passage using another measure called SMOG (Simple Measure of Gobbledygook).

Whereas Flesch Reading Ease index rates how easy it is to read a passage on a scale of 0 to 100, SMOG gauges how hard it is to read a passage by calculating the proportion of words with 3 or more syllables.
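
If you want to play with the Flesch Reading Ease arithmetic yourself, it only takes a few lines. Here's a minimal sketch (the helper names are my own, and the syllable counter is a deliberately crude vowel-group heuristic), so expect rough approximations rather than an exact match for Word's numbers:

    import re

    def count_syllables(word):
        # Crude heuristic: one "syllable" per run of consecutive vowels
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    def flesch_reading_ease(text):
        # Flesch Reading Ease = 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words)
        sentences = max(1, len(re.findall(r"[.!?]+", text)))
        words = re.findall(r"[A-Za-z']+", text)
        if not words:
            return 0.0
        syllables = sum(count_syllables(w) for w in words)
        return 206.835 - 1.015 * (len(words) / sentences) - 84.6 * (syllables / len(words))

    # A dense, polysyllabic sentence can even dip below zero...
    print(flesch_reading_ease("Sustainable cost savings can be achieved through "
                              "implementation of an optimization solution."))
    # ...while a plain one lands comfortably in easy-reading territory
    print(flesch_reading_ease("You can cut costs with one complete tool."))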

The SMOG Calculator of G. Harry McLaughlin (its creator) rated the passage as more difficult than the Harvard Business Review yet slightly less difficult than The IRS Tax Code. This is a vote of confidence for the Flesch Reading Ease as a metric and for the results shown in Word.

Why Should IQNavigator Care?

The IQNavigator benefits page is very difficult to read. If we can make it easier to read, IQNavigator will attract new business and deepen its position with existing customers.

Copywriting Tune-up (Part Two)

Clearly, IQNavigator has the right idea dedicating a webpage to call out their benefits. Still, as we've seen with other enterprise software companies, they write in third person voice and use passive sentences. This makes it harder for anyone to read and understand, no matter what their IQ.

IQNavigator wisely uses bullet points to enumerate their benefits; however, each bullet makes for long and dense reading.

So, the challenge for this tune-up is to:

  • Inject the second person voice
  • Eliminate passive sentences
  • Maintain an appropriate corporate tone
  • Make the benefits easier to understand
  • Convert each lengthy bullet into several shorter ones

Before

After

Fast cost savings, Ongoing investment

Enterprises in diverse industries have found that sustainable cost savings and process improvements can be achieved through implementation of an end-to-end services procurement and optimization solution. IQNavigator's market-leading solution provides several bottom-line benefits:

  • Cost reduction: Reduce costs by 10-35% by implementing best -practices for sourcing services, eliminating manual invoice reconciliation, gaining consistent terms and renegotiating with more accurate spending and performance information, and enforcing approvals for all spending, contract extensions and exceptions.
  • Process efficiencies: Automate the procurement and payment processes to reduce cycle time and cost over 70% while improving the resulting services quality, contract terms, and payment speed and accuracy.
  • Manage compliance risks: Ensure compliance with company policies, supplier contract terms and government regulations through configurable compliance rules and approval requirements, and enforcement of contract terms and rates. Financial compliance is also achieved through spending approval requirements and process controls, auditability, and accurate invoicing and cost allocation.
  • Optimization: Improve the business results achieved through outside services by aligning services spending with business priorities and initiatives, continually improving deliverable quality and value, and linking purchased services to internal key business measures. IQNavigator's distinctive business intelligence capabilities provide visibility and analysis capabilities into spending, supplier performance, and business results.

Gain Control over the Service Procurement Life Cycle

Join the companies in every sector who have reduced their costs and streamlined their processes with a complete solution to manage the services they procure and the quality of the services performed.

Address this challenge head-on and you can:

  • Lower your costs by 10 - 35%.
  • Adopt the best practices to procure services.
  • Stop reconciling invoices manually.
  • Standardize the terms in your contracts.
  • Renegotiate your contracts using more accurate spending and performance information.
  • Enforce approvals for all your spending, contract extensions, and exceptions.

When you automate your procurement and payment processes, you will:

  • Reduce cycle time and related costs by 70%.
  • Improve the quality of the services performed.
  • Gain better control over contract terms, accuracy, and speed of payment.

Set up compliance rules so you can:

  • Ensure your company complies with its own policies, suppliers' contract terms and government regulations.
  • Meet your financial compliance goals with approval requirements, process controls, audit specifications, and accurate invoicing and cost allocations.

Enjoy better performance from your service providers when you:

  • Match your spending with your business priorities.
  • Link the services you purchase to your key business metrics.
  • Improve the quality and value of the specifications you give to your service providers.

See the relationships among spending, supplier performance, and business results when you apply the unique business intelligence capabilities of IQNavigator.

Readability Statistics

In Part One , we promised to use composite results for Flesch Reading Ease and SMOG. We found consistent results across all the different tools. The only real deviation was Aella Lei's Writing Sample Analyzer with a Flesch Reading Ease of 11.05 for the Before snippet.

Averaged across 3 different tools (Word, Flesh, and Writing Sample Analyzer), the Before snippet scores a 3.68. The After snippet, with an average of 36.60, improves the Flesch Reading Ease composite by a factor of 9.95.
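
As a sanity check on that composite, the Before figure follows directly from the three individual scores quoted in Part One (Word 0.0, Flesh 0.0, Writing Sample Analyzer 11.05):

    # Composite Flesch Reading Ease for the Before snippet
    print((0.0 + 0.0 + 11.05) / 3)  # 3.68 to two decimal places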

The SMOG composite is the average of results from the SMOG Calculator and Writing Sample Analyzer's FOG measure. The Before snippet scored a 20.67 – well into IRS Tax Code territory for difficulty of reading. The After snippet average is 13.06 – just enough to take it out of Time Magazine and into The New York Times.

Finally, the After snippet eliminates all passive sentences for added clarity.

Place Heady Headlines in a Guillotine but Don't be Afraid to Stick Your Neck Out

Nouns are "headier" than verbs. Nouns require us to think, "What is this thing?" whereas verbs prompt us to, "Just do it." As a B2B marketer, you don't want the reader to ponder anything. Instead, with zero friction, you want to answer their most pressing question - "what's in it for me?" If you answer what's in it for them then your headline also serves to summarize the main takeaway of the piece.

The first half of the Before snippet headline, even if it doesn't start with an action verb, has the right idea by focusing on a benefit. Unfortunately, the second half doesn't contain a verb either and its meaning as a noun is cryptic at best. Does it mean an investment paying regular dividends or having to continually shell out precious investment capital to get the full benefit of the software?

So, "Ongoing investment" can sound positive or negative depending on the context. Since the context is not yet clear, this only compounds the confusion and confusion in a headline means the reader will:

  • Fail to see what's in it for them
  • Abandon the page

The After snippet opens with an action verb. More importantly, it offers a promise – go with IQNavigator and you'll get your house in order when it comes to managing all of the services you outsource. Frankly, I think I could have written a more powerful promise, but this does the job because it clearly states the main takeaway to be gleaned from reading on.

State Your BIG IDEA Clearly and Avoid Weasel Words

The body copy immediately following your headline is "The Lead." The Lead delivers your BIG IDEA and in so doing, emotionally hooks your prospect into reading the rest of the page. To do this, the BIG IDEA must be powerful and compelling.

The Before snippet opens this crucial part of the copy with a passive sentence in third person voice. In a previous post, we discussed passive sentences at length and how they are harder for readers to understand because they leave out the subject of the sentence. Often, writers will use passive sentences to avoid assigning responsibility for the outcome of an action.

In this case, the first sentence reads as if we're discussing a phenomenon in nature with no readily identifiable cause. The culprit words are "can be" as in, "Sustainable cost savings and process improvements ‘can be' achieved through implementation…" The phrase "can be" will register with the reader as "weasel words" and undermine any promise made in the headline.

Some might say it's reasonably clear who the subject in this sentence is – "it's enterprises in diverse industries." Not so. All it says is what these enterprises "have found." For all we know, IQNavigator is relating the findings of research these enterprises undertook and nothing more. Nowhere does it say an enterprise implemented an "end-to-end services procurement and optimization solution."

Several other things dilute the power of this lead. One of them is using the verb "found" in the present perfect tense as in "Enterprises in diverse industries have found…" You won't find what I'm about to say at English Grammar Online. By placing "have" in front of a non-action verb like "found," we add another layer of indirection between "Enterprises in diverse industries" and the result IQNavigator wishes to communicate.

Yet another layer of indirection kicks in with the word "that" following "have found." "That" is a signal to the reader to get ready and think at a more abstract level. Nothing kills a trance like having to think more conceptually.

Things get even more abstract when you refer to benefits in noun form instead of connecting them with a subject using verbs. "Sustainable cost savings" and "process improvements" are heady ideas when expressed as nouns. The After snippet gets around this with "Join the companies in every sector who have reduced their costs and streamlined their processes…" Using the present perfect tense with tangible verbs works well when placed in an active sentence.

Can You Describe Your Product in Plain English?

It's important to refer to the noun you're selling as concretely as possible. Does an "end-to-end services procurement and optimization solution" seem easy to digest? Remember, this passage is still in overview mode, so an adjective like "end-to-end" is completely opaque. Adding to this sense of mystery are the two abstract nouns embedded in this product definition – "services procurement" and "services optimization."

Again, nouns are heady and verbs go to the gut. This is why the After snippet refers to the product as a "complete solution to manage the services they procure and the quality of the services performed ." Sure, the After snippet takes 15 words to describe the product where the Before snippet needed only 6 but what's more important here – being brief and indecipherable or longer and easy to understand?

Is it About You and Them or What You can Do for Them?

If you're still in The Lead conveying your BIG IDEA, the last thing you want to do is shift the focus from the prospect to yourself. The Before snippet slips into self-centeredness by starting the second sentence with "IQNavigator's market-leading solution…" This sentence recovers the appropriate focus by ending with "…provides several bottom-line benefits." Still, it comes up short for a couple of reasons.

First, it leads into the bullets with a noun ("bottom-line benefits") and it forces each bullet to start with an even headier noun (e.g., "Cost reduction", "Process efficiencies", "Optimization"). The one bullet starting with a verb ("Manage compliance risks") has an awkward flow if read straight from the lead-in.

Second, this sentence is written in the third person at just the moment we're setting up the prospect to picture this benefit in their mind's eye. Third person prompts the prospect to imagine these benefits in a fainter way because they must now try to make sense of them in a context external to their own situation. As such, the Before snippet turns this "picturing process" into an academic exercise.

Thanks to a clear promise in the headline and an unambiguous BIG IDEA in The Lead, the After snippet flows into the "picturing process" with a simple sentence focused on the prospect's interests. This lead-in sentence ties cleanly into 6 short and sweet bullets, each one starting with an action verb and describing a benefit to support the promise.

Ever Get the Feeling Something's Missing?

Ideally, we'd follow these bullets with copy to prove IQNavigator delivers on these benefits. After a promise and a picture, proof helps cement the bond we're trying to establish with the prospect.

An additional section on what makes IQNavigator unique would flesh out this benefits page. Once our prospect has shown an interest in the promise, pictured the benefits in her mind's eye, and agreed with the proof, learning how IQNavigator is unique would bring us dangerously close to what we want – follow-through on our call-to-action.

Oops, we have a problem… there is no call-to-action.

Obviously, with enterprise software, we're not going to hypnotize anyone into making a purchase on the spot but all kinds of things are possible:

  • Check out upcoming events
  • Sign up for a webinar
  • Read drill-down documents like a datasheet or white paper

Wrap-up

By using action verbs liberally and sprinkling "you" and "your" throughout, we use what I called, in a previous post, "implied second person voice." Implied second person voice helps convert features into benefits with a more immediate feel so prospects can easily see "what's in it for them." If you're a B2B marketer, implied second person voice is your ticket to bypass "sales cheesiness."

As a B2B marketer with complex products, you should use bullets to make your benefits and proof points come across clearly and forcefully. Avoid cramming too much into a single bullet – additional scrolling is worth the added white space and clearer copy. After all, "easier to read means more sales and leads."

To your marketing success,

Eric Rosen
Strategic Marketing Writer
eric.rosen AT clearcrisp.com
Clear Crisp Communications
Easier to Read Means More Sales and Leads



          Give Your B2B Marketing Materials a SMOG Test and Find Out if They Read like Newsweek or The IRS Tax Code (Part One)        

Overview

Sometimes, a piece of B2B marketing literature defies conventional wisdom when it comes to measuring its ease of reading. The one we look at this time scored a zero on the Flesch Reading Ease and for me, alarm bells went off. After all, no piece is completely unreadable, right?

Nevertheless, I want to make it clear I'm doing everything humanly possible to give this piece a fair shake.

To eliminate Microsoft Word's implementation of Flesch Reading Ease as a factor, we look at several alternative tools for calculating Flesch Reading Ease. To eliminate the Flesch Reading Ease as an unreliable metric, we use an additional measure called SMOG (Simple Measure of Gobbledygook).

Loosely speaking, SMOG complements Flesch Reading Ease because it measures how difficult it is to read a passage whereas Flesch Reading Ease calculates how easy it is to read. The SMOG level depends on the proportion of words with 3 or more syllables. The higher the SMOG level, the more difficult it is to read the passage. The SMOG level helps equate a passage with other well-known reading materials of similar complexity. This is how references to The IRS Tax Code and Newsweek made it into the title of this post.
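
For the curious, the SMOG arithmetic is just as easy to sketch. McLaughlin's formula is 1.0430 x sqrt(polysyllables x (30 / sentences)) + 3.1291, where the 30/sentences factor scales the polysyllable count as if you had sampled 30 sentences. Again, the syllable counter below is a crude vowel-group heuristic and the helper names are my own, so treat the output as ballpark only:

    import math
    import re

    def count_syllables(word):
        # Crude heuristic: one "syllable" per run of consecutive vowels
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    def smog_grade(text):
        # SMOG grade = 1.0430 * sqrt(polysyllables * (30 / sentences)) + 3.1291
        sentences = max(1, len(re.findall(r"[.!?]+", text)))
        words = re.findall(r"[A-Za-z']+", text)
        polysyllables = sum(1 for w in words if count_syllables(w) >= 3)
        return 1.0430 * math.sqrt(polysyllables * (30.0 / sentences)) + 3.1291

    # By this post's own yardstick, high-teens and up is IRS Tax Code territory
    print(smog_grade("Financial compliance is also achieved through spending approval "
                     "requirements and process controls, auditability, and accurate invoicing."))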

So, this edition of Copywriting Tune-ups is Part One and it focuses on my efforts to arrive at a genuine assessment of how easy or difficult it is to read the B2B marketing piece we discuss below. In Part Two, we'll do the usual deconstruction of how the tune-up transforms the piece into a new and improved sales tool.

Those of you only interested in Part Two may find Part One useful because it relies on the tune-up to tell its story. Originally, I hoped to write a self-contained, one-part tune-up emphasizing the value of using bullet points. Along the way, I discovered how the After snippet readings fluctuated wildly depending on whether periods followed the bullets.

In particular, bullet points without periods greatly influence the results we get with both Flesch Reading Ease and SMOG. For this reason, we'll see the Before and After snippets in both Part One and Part Two.

For the balance of Part One, we:

  • Set up the makeover
  • Show the Before and After snippets side-by-side with the usual Microsoft Word screenshots of Flesch Reading Ease
  • Explain the methodology and findings I used to eliminate the Flesch Reading Ease metric and the various Flesch Reading Ease calculation tools as factors in the B2B marketing piece's zero score on Flesch Reading Ease

Call Out Your Benefits - Just Don't Lose Them in SMOG

Since converting features to benefits has been a prevalent theme in recent tune-ups, let's review the Challenge and Benefits webpage of an enterprise software company, IQNavigator of Denver. IQNavigator technology helps large organizations manage all of the services they outsource to other companies.

Clearly, IQNavigator has the right idea dedicating a webpage to call out their benefits. Still, as we've seen with other enterprise software companies, they write in third person voice and use passive sentences. This makes it harder for anyone to read and understand, no matter how well-educated.

IQNavigator wisely uses bullet points to enumerate their benefits; however, each bullet makes for long and dense reading.

Copywriting Tune-up

So, the challenge for this tune-up is to:

  • Inject the second person voice
  • Eliminate passive sentences
  • Maintain an appropriate corporate tone
  • Make the benefits easier to understand
  • Convert each lengthy bullet into several shorter ones which are easier to read

Before

After

Fast cost savings, Ongoing investment

Enterprises in diverse industries have found that sustainable cost savings and process improvements can be achieved through implementation of an end-to-end services procurement and optimization solution. IQNavigator's market-leading solution provides several bottom-line benefits:

  • Cost reduction: Reduce costs by 10-35% by implementing best -practices for sourcing services, eliminating manual invoice reconciliation, gaining consistent terms and renegotiating with more accurate spending and performance information, and enforcing approvals for all spending, contract extensions and exceptions.
  • Process efficiencies: Automate the procurement and payment processes to reduce cycle time and cost over 70% while improving the resulting services quality, contract terms, and payment speed and accuracy.
  • Manage compliance risks: Ensure compliance with company policies, supplier contract terms and government regulations through configurable compliance rules and approval requirements, and enforcement of contract terms and rates. Financial compliance is also achieved through spending approval requirements and process controls, auditability, and accurate invoicing and cost allocation.
  • Optimization: Improve the business results achieved through outside services by aligning services spending with business priorities and initiatives, continually improving deliverable quality and value, and linking purchased services to internal key business measures. IQNavigator's distinctive business intelligence capabilities provide visibility and analysis capabilities into spending, supplier performance, and business results.

 

Gain Control over the Service Procurement Life Cycle

 

Join the companies in every sector who have reduced their costs and streamlined their processes with a complete solution to manage the services they procure and the quality of the services performed.

Address this challenge head-on and you can:

  • Lower your costs by 10 - 35%.
  • Adopt the best practices to procure services.
  • Stop reconciling invoices manually.
  • Standardize the terms in your contracts.
  • Renegotiate your contracts using more accurate spending and performance information.
  • Enforce approvals for all your spending, contract extensions, and exceptions.

When you automate your procurement and payment processes, you will:

  • Reduce cycle time and related costs by 70%.
  • Improve the quality of the services performed.
  • Gain better control over contract terms, accuracy, and speed of payment.

Set up compliance rules so you can:

  • Ensure your company complies with its own policies, suppliers' contract terms and government regulations.
  • Meet your financial compliance goals with approval requirements, process controls, audit specifications, and accurate invoicing and cost allocations.

Enjoy better performance from your service providers when you:

  • Match your spending with your business priorities.
  • Link the services you purchase to your key business metrics.
  • Improve the quality and value of the specifications you give to your service providers.

See the relationships among spending, supplier performance, and business results when you apply the unique business intelligence capabilities of IQNavigator.

Ensuring a Balanced Comparison of Before and After

As mentioned above, we need to adjust for bullet points without periods. To ensure a balanced comparison between the Before and After snippets, the After snippet mimics the Before snippet by placing a period at the end of each of its own bullets.

How Bullet Points without Periods Affect the After Snippet

Remove all bullet periods and the Flesch Reading Ease result for the After snippet is 22.6. In my opinion, this dramatically understates the ease of reading we feel intuitively when we read it. Place a period at the end of the last bullet in each group and Flesch Reading Ease rises to 27.1 – still too low, based on my experience with previous tune-ups.

Eliminating Different Implementations of Flesch Reading Ease as Unfair to the Before Snippet

Now, let's look at the Before snippet. At first, I thought the Microsoft Word implementation of the Flesch Reading Ease index may be unduly stern with the Before snippet. After all, it defies common sense to say it's completely unreadable.

I calculated the Flesch Reading Ease myself in a spreadsheet and came out with a slightly negative number. In search of a more satisfying sanity check, I downloaded a free Java application called Flesh and came away with these Before and After results:

[Flesh results: Before and After screenshots]

Well, Flesch Reading Ease at 0.0 across 3 measuring tools is hard to dismiss.

No doubt, the After snippet grade level looks inflated and its Flesch Reading Ease, understated. Whether we can accept the grade level figures of Flesh at face value is beyond the scope of this posting.

Eliminating the Flesch Reading Ease Metric as Unfair to the Before Snippet

Still, I felt a nagging sense that Flesch Reading Ease was treating the Before snippet unfairly. Perhaps something other than Flesch Reading Ease would provide reasonable results. Enter the SMOG Calculator of G. Harry McLaughlin, creator of SMOG:

[SMOG Calculator results: Before and After screenshots]

With a SMOG reading of 18.49, the Before snippet verges on the same reading level as The IRS Tax Code. Unfortunately, as with Flesh and Word, the lack of periods following bullets leads to a highly skewed reading for the After snippet - a higher SMOG level than the Before snippet! When we add in periods as shown in the After snippet above, the SMOG level falls between Newsweek and Sports Illustrated.

One Last Attempt to Second-guess both Flesch Reading Ease and SMOG

If SMOG equates the Before snippet with the IRS Tax Code, I can accept it, but it still seems unfair to give it a Flesch Reading Ease of zero. Enter Aella Lei's Writing Sample Analyzer.

Like Flesh and Word, Writing Sample Analyzer is vulnerable to bullet points without periods but the most interesting thing about Writing Sample Analyzer is its calculation of Flesch Reading Ease:

[Writing Sample Analyzer results: Before and After screenshots]

Instead of zero, Writing Sample Analyzer returns 11.05. Why? I don't know yet but once I hear back from Aella Lei, I'll let you know. While a jump from 0.0 to 11.05 is considerable and raises questions about this implementation of Flesch Reading Ease, 11.05 still makes for difficult reading.

Also note that the Fog Scale is close kin to SMOG - both key off the proportion of polysyllabic words. It looks reasonable for the Before snippet. The Before snippet's grade level also falls in line with expectations. On the other hand, its results for the After snippet are out of whack across the board due to periods following only the last bullet point in each bullet point group.

When we add periods to each After snippet bullet, Writing Sample Analyzer responds with new figures: the Flesch Reading Ease looks about right, but the Fog and Grade Level seem too high. Even so, the Before and After comparisons for Flesch Reading Ease, SMOG, and Grade Level all seem reasonable on a relative basis.

So, this B2B Marketing Piece Really is Hard to Read

All in all, both Flesch Reading Ease and SMOG support one another. In addition, while various tools for calculating Flesch Reading Ease give slightly different results, we can safely assert none of them are skewing the results to the point of denying this B2B Marketing piece a fair shake.

Wrap-up

To be completely even-handed with a B2B marketing piece scoring zero on the Flesch Reading Ease index, we looked at several implementations of the index. This helped eliminate quirks with any given tool as a contributing factor.

To eliminate Flesch Reading Ease itself as an unreliable barometer, we introduced a measure of reading difficulty called SMOG. SMOG gave us the most intuitive sense of how hard this literature is to read because it equates the piece with other well-known publications in the same SMOG range. On this basis, the piece ranks with The Harvard Business Review at the low end and The IRS Tax Code at the high end.

All of the tools used were susceptible to a lack of periods following bullet points because:

  • they rely on periods to determine the number of sentences in a passage and
  • number of sentences is an input to further calculations.

Since the After snippet breaks out the 4 long bullets of the Before snippet into 14 shorter ones, the After snippet figures improved most dramatically when we added periods to every bullet. It's reasonable to do this because the mind processes bullets as if they were separate sentences or ideas.

Also, adding periods to the end of each bullet makes for a fair comparison since the Before snippet ends each bullet with a period as well.
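
To see the mechanics for yourself, here's a tiny illustrative sketch (the bullet text and helper name are just examples): with no terminal punctuation, an entire bulleted list registers as one monster "sentence", which inflates the words-per-sentence figure that both Flesch Reading Ease and SMOG feed on.

    import re

    def words_per_sentence(text):
        # Both Flesch and SMOG key off sentence counts taken from terminal punctuation
        sentences = max(1, len(re.findall(r"[.!?]+", text)))
        words = len(re.findall(r"[A-Za-z0-9'%-]+", text))
        return words / sentences

    bullets = [
        "Lower your costs by 10-35%",
        "Stop reconciling invoices manually",
        "Standardize the terms in your contracts",
    ]

    print(words_per_sentence(" ".join(bullets)))         # no periods: one long "sentence" of 15 words
    print(words_per_sentence(". ".join(bullets) + "."))  # a period per bullet: three sentences, 5 words each on average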

In Part Two, we will:

  • use composite figures for both the Before and After snippets
  • rely on those figures to gauge the improvements the tune-up demonstrates

Finally, in Part Two, we'll deconstruct the Before and After snippets to address the challenges we set forth for this tune-up.

 

To your marketing success,

Eric Rosen
Strategic Marketing Writer
eric.rosen AT clearcrisp.com
Clear Crisp Communications
Easier to Read Means More Sales and Leads


          NHL man-games lost and CHIP analysis - 30-game report        
This is my third look for the 2013/14 regular season at which teams have been hit hardest by injuries, placing a value on the games missed by players due to injury/illness.

The concept again - multiply each game missed by a player by his 2013/14 cap charge (including bonuses), then take the aggregate of these figures for each team and divide by 82. This indicator of value lost to a team by injury/illness is called CHIP (Cap Hit of Injured Players).
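
For anyone who wants to reproduce the numbers, the CHIP arithmetic is trivial. Here's a minimal sketch (the function name and data layout are my own, and the sample figures are illustrative rather than taken from the tables below):

    def team_chip(injuries, season_games=82):
        # injuries: list of (games_missed, annual_cap_hit) pairs for one team's injured players
        # CHIP = sum of each player's games missed x cap charge, divided by 82
        return sum(games * cap_hit for games, cap_hit in injuries) / season_games

    # e.g. a $4m player missing 5 games plus a $1m player missing 20 games
    print(team_chip([(5, 4000000), (20, 1000000)]))  # roughly $488,000 of cap value lost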

This analysis covers every team up to its 30th game. (Amazingly, this follows on from my 20-game analysis.)

For a more regular snapshot, CHIP rankings are also being fed into Rob Vollman's Team Luck calculator on a weekly basis and, if I find time after figuring out how best to incorporate a reference to Derick Brassard's injured posterior in the next update, I'll do my best to put out the same info via Twitter (@LW3H).

Alternatively...
Again, for a different indicator of player "value", I've also illustrated a similar metric based on TOI/G alongside the CHIP numbers.  Clearly, neither cap charge nor TOI/G are perfect measures of player value, since each have a number of limitations and inconsistencies, but they provide a decent comparison and the results do vary somewhat.

A quick summary of the alternative metric:
  • TOI/G replaces cap charge as the measure of value in the calculation
  • For goalies, TOI/G has been worked out as Total Minutes Played / Games Dressed For* - i.e. a goalie playing every minute of 75% of the games, zero in the rest, would end up with a TOI/G of 45 minutes (or close to it, once you factor in OT and so on).  [*Actually, "Games Played by Team - Games Missed by Goalie" - I'm not inclined to disentangle any three-goalie systems or minor-league conditioning stints.]
  • This arguably overstates the worth of starting goalies somewhat, but it's simple and you could equally argue that a workhorse goalie is the hardest position to replace, so it's fair for them to have a much higher TOI/G figure
  • Where a player hasn't played all year or where a player fairly clearly has a reduced TOI/G figure due to getting injured in their only game or one of very few games, I've used TOI/G from last season (or further back if necessary)
  • For each player, multiply games missed by TOI/G to get (for a more palatable name) Cumulative Minutes of Injured Player (CMIP)
  • Take the aggregate of CMIP for the team and divide by games played by the team to arrive at AMIP (Average Minutes of Injured Players) - it feels more understandable expressing this metric as an average per game (whereas CHIP is a running total) - see the quick sketch below
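
Put together, the bullets above boil down to a couple of lines. Here's a minimal sketch of the CMIP/AMIP arithmetic (names and sample numbers are my own, not taken from the tables below):

    def team_amip(injuries, team_games_played):
        # injuries: list of (games_missed, toi_per_game_minutes) pairs for one team
        # CMIP per player = games missed x TOI/G; AMIP = total CMIP / games played by the team
        total_cmip = sum(games * toi for games, toi in injuries)
        return total_cmip / team_games_played

    def goalie_toi_per_game(total_minutes_played, team_games, games_missed):
        # Goalie TOI/G as described above: total minutes played / games dressed for
        return total_minutes_played / (team_games - games_missed)

    # e.g. a 20-minute defenseman out 15 games plus a 12-minute forward out 10 games, 30 games in
    print(team_amip([(15, 20.0), (10, 12.0)], 30))  # 14.0 minutes of injured players per game
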
The figures...
The table below shows:
  • Total CHIP for each team over the first 30 games of the regular season, as well as the distribution of CHIP by position
  • The player who has contributed most to the team's CHIP figure
  • The number of players with a CHIP contribution of over $250,000 (think of it as being equivalent to a $1m player missing 20 games or a $4m player missing five games)
  • AMIP for each team over the same period (e.g. an AMIP of 40:00 could be seen as the team missing two 20-minute per game players for every game this season) - for non-interesting reasons, average TOI figures aren't quite measured at the 30-game point this time around for a few teams, but of no great significance to the analysis

The following is a ranking of teams by CHIP over Games 21-30 only, to further illustrate some of the biggest movers since last time:


10 second analysis...
  • Columbus (Horton, Gaborik, Dubinsky) and Tampa Bay (Stamkos and a bunch of not-Stamkoses) the most notable risers
  • Ottawa's only significant losses remain on the scoreboard and in Eugene Melnyk's bank balance
  • With only one game missed among the group, the health of the Rangers defense clearly benefitting from the shift away from the maniac Tortorella's shot-block-at-all-costs philosophy, with only Vancouver's blueline being healthier, clearly benefitting from the shift towards the maniac Tortorella's shot-block-at-all-costs philosophy (Staal and Edler injuries after Game 30 not yet showing up here)
  • As you would expect, the lack of injuries in Buffalo over the 10-game period has contributed to the team's dramatic turnaround to elite status
  • As you would expect, the lack of goaltender injuries in Calgary over the whole season has contributed to the team's stellar save percentage

The next lists are the top 30 individual CHIP and CMIP contributions:

As the video shows, Tim Gleason's absences have been of precisely the same value as Evgeni Nabokov's in terms of TOI, with the obvious flaw being that Gleason's save percentage is much, much better than Nabokov's.

Where does it hurt?
This is another update of the crude injury-by-location analysis. Again, I’ve just used the descriptions found in the player profiles on tsn.ca, so the figures will encompass all the inaccuracies and vagueness within them. It should give a broad indication, if nothing else, though.

 
Again, as more players have come back from injuries pre-dating the season starting, the crude rate of injuries (instances / total games played) has drifted down further from 0.91 (10 games) to 0.86 (20 games) to 0.83, compared to 0.80 per game last year (0.78 in 2011/12 and 0.76 in 2010/11).

Finally, a look at the Evasiveness Index.  This is basically the proportion of injury instances for each team that have been described as either "Undisclosed" or the helpfully pointless "Upper/Lower Body" in the same TSN profiles.  I have made no judgement about whether the many instances of "Illness" (i.e. concussion), "Flu" (i.e. concussion) should also be included.


A couple of rare lapses of disclosure in Carolina.  Interestingly, the Flames have a regular injury update page on their official website, though it still doesn't preclude them from having one or two UBI/LBI cases, nor from Brian Burke firing the page so he can announce the injuries himself in a regular press conference (but not during his annual Christmas injury freeze).

Notes/Disclaimers
  • Figures exclude a few minor-leaguers / marginal NHLers (usually an arbitrary judgement on my part) who had been on the NHL club’s IR since pre-season. Generally, if a minor-leaguer gets called up and then injured in an NHL game, his games missed will then count towards the CHIP though.  Minor-league conditioning stints immediately after/during a period on IR might be included in the man-games lost figures (but can't guarantee I get it right every time)
  • For the avoidance of doubt, suspensions and absences due to "personal reasons" are not included in the figures.  However, as per previous seasons, any "retired" player still under contract (Savard, Pronger, Ohlund) is still included.
  • There are undoubtedly a few inaccuracies and inconsistencies in there - I do the best I can with the information out there. Corrections might well be picked up in subsequent updates
  • The cap figure obviously doesn't really correlate very well to the "worth" of a player in some cases, e.g. where players are seeing out an old (underpaid or rookie) contract or where players are horrendously overpaid and/or were signed by Paul Holmgren
  • Also, for any player traded where cap hit is retained by his old team, the cap hit used will only reflect that for his current team.
  • I've once again stuck a full team-by-team listing of games missed and CHIP/CMIP numbers by each player on the web HERE
  • Injury/games/TOI info courtesy of tsn.ca and nhl.com - man-games lost info more than likely does not exactly match up with the "official" figures released by individual teams
  • Cap info courtesy of capgeek.com

          NHL man-games lost and CHIP analysis - 20-game report        
This is my second look for the 2013/14 regular season at which teams have been hit hardest by injuries, placing a value on the games missed by players due to injury/illness.

The concept again - multiply each game missed by a player by his 2013/14 cap charge (including bonuses), then take the aggregate of these figures for each team and divide by 82. This indicator of value lost to a team by injury/illness is called CHIP (Cap Hit of Injured Players).

This analysis covers every team up to its 20th game. (Amazingly, this follows on from my 10-game analysis.)

For a more regular snapshot, CHIP rankings are also being fed into Rob Vollman's Team Luck calculator on a weekly basis and I'll do my best to put out the same info via the medium known to Randy Carlyle as "that Wheatie machine or whatever" (@LW3H).

Alternatively...
Again, for a different indicator of player "value", I've also illustrated a similar metric based on TOI/G alongside the CHIP numbers.  Clearly, neither cap charge nor TOI/G are perfect measures of player value, since each have a number of limitations and inconsistencies, but they provide a decent comparison and the results do vary somewhat.

A quick summary of the alternative metric:
  • TOI/G replaces cap charge as the measure of value in the calculation
  • For goalies, TOI/G has been worked out as Total Minutes Played / Games Dressed For* - i.e. a goalie playing every minute of 75% of the games, zero in the rest, would end up with a TOI/G of 45 minutes (or close to it, once you factor in OT and so on).  [*Actually, "Games Played by Team - Games Missed by Goalie" - I'm not inclined to disentangle any three-goalie systems or minor-league conditioning stints.]
  • This arguably overstates the worth of starting goalies somewhat, but it's simple and you could equally argue that a workhorse goalie is the hardest position to replace, so it's fair for them to have a much higher TOI/G figure
  • Where a player hasn't played all year or where a player fairly clearly has a reduced TOI/G figure due to getting injured in their only game or one of very few games, I've used TOI/G from last season (or further back if necessary)
  • For each player, multiply games missed by TOI/G to get (for a more palatable name) Cumulative Minutes of Injured Player (CMIP)
  • Take the aggregate of CMIP for the team and divide by games played by the team to arrive at AMIP (Average Minutes of Injured Players) - it feels more understandable expressing this metric as an average per game (whereas CHIP is a running total)
The figures...
The table below shows:
  • Total CHIP for each team over the first 20 games of the regular season, as well as the distribution of CHIP by position
  • The player who has contributed most to the team's CHIP figure
  • The number of players with a CHIP contribution of over $250,000 (think of it as being equivalent to a $1m player missing 20 games or a $4m player missing five games)
  • AMIP for each team over the same period (e.g. an AMIP of 40:00 could be seen as the team missing two 20-minute per game players for every game this season) 

The following is a ranking of teams by CHIP over Games 11-20 only, to further illustrate some of the biggest movers since last time:


10 second analysis...
  • The Central isn't injury central
  • Boston, Philadelphia and (to a lesser extent) Tampa Bay have been much healthier than it would appear if you deduct the totals from their pseudo-retired players (see below)
  • San Jose lead the way in Burns injuries
  • Ottawa will presumably start injuring their own players, if they run any sort of points-to-injury correlation analysis covering last season and this season only
  • Among Buffalo's throng of fringe NHLers/borderline AHLers/Grigorenkos, I had originally excluded Corey Tropp's (eventual 15-game) absence from their numbers, but have now included it since he played games 16-20

The next lists are the top 30 individual CHIP and CMIP contributions:

The CMIP table is goalie heavy (insert Brodeur gag here) as is usually the case.  Carolina with four of the top 17 CHIP numbers, giving opposition team announcers something to talk about in the 30 seconds they have to spare between "Look! Two brothers on the same team!" conversation.

Where does it hurt?
This is another update of the crude injury-by-location analysis. Again, I’ve just used the descriptions found in the player profiles on tsn.ca, so the figures will encompass all the inaccuracies and vagueness within them. It should give a broad indication, if nothing else, though.

 
An impressive early season showing for UBI/LBI cases, which only (?) made up 22% of the injury instances last season.  As more players have come back from injuries pre-dating the season starting, the crude rate of injuries (instances / total games played) has drifted down from 0.91 to 0.86, compared to 0.80 per game last year (0.78 in 2011/12 and 0.76 in 2010/11).

Finally, a look at the Evasiveness Index.  This is basically the proportion of injury instances for each team that have been described as either "Undisclosed" or the helpfully pointless "Upper/Lower Body" in the same TSN profiles.  I have made no judgement about whether the many instances of "Illness" (i.e. concussion), "Flu" (i.e. concussion) should also be included.


Phoenix and Carolina no strangers to the top of this table, Edmonton and Colorado no strangers to the other end.

Notes/Disclaimers
  • Figures exclude a few minor-leaguers / marginal NHLers (usually an arbitrary judgement on my part) who had been on the NHL club’s IR since pre-season. Generally, if a minor-leaguer gets called up and then injured in an NHL game, his games missed will then count towards the CHIP though.  Minor-league conditioning stints immediately after/during a period on IR might be included in the man-games lost figures (but can't guarantee I get it right every time)
  • For the avoidance of doubt, suspensions and absences due to "personal reasons" are not included in the figures.  However, as per previous seasons, any "retired" player still under contract (Savard, Pronger, Ohlund) is still included.
  • There are undoubtedly a few inaccuracies and inconsistencies in there - I do the best I can with the information out there. Corrections might well be picked up in subsequent updates
  • The cap figure obviously doesn't really correlate very well to the "worth" of a player in some cases, e.g. where players are seeing out an old (underpaid or rookie) contract or where players are horrendously overpaid and/or were signed by Paul Holmgren
  • Also, for any player traded where cap hit is retained by his old team, the cap hit used will only reflect that for his current team.
  • I've once again stuck a full team-by-team listing of games missed and CHIP/CMIP numbers by each player on the web HERE
  • Injury/games/TOI info courtesy of tsn.ca and nhl.com - man-games lost info more than likely does not exactly match up with the "official" figures released by individual teams
  • Cap info courtesy of capgeek.com

          NHL man-games lost and CHIP analysis - 10-game report        
This is my first look for the 2013/14 regular season at which teams have been hit hardest by injuries, placing a value on the games missed by players due to injury/illness.

The concept again - multiply each game missed by a player by his 2013/14 cap charge (including bonuses), then take the aggregate of these figures for each team and divide by 82. This indicator of value lost to a team by injury/illness is called CHIP (Cap Hit of Injured Players).

All being well, I'll be cutting the season into 10-game chunks for ease of comparability.  So this analysis covers every team up to its 10th game.  Blame the St. Louis Blues for being the last team to get there.

For a more regular snapshot, CHIP rankings are being fed into Rob Vollman's world-famous, Puck Daddy commenter approved, Team Luck calculator.  So go there, people.

Alternatively...
Again, for a different indicator of player "value", I've also illustrated a similar metric based on TOI/G alongside the CHIP numbers.  Clearly, neither cap charge nor TOI/G are perfect measures of player value, since each have a number of limitations and inconsistencies, but they provide a decent comparison and the results do vary somewhat.

A quick summary of the alternative metric:
  • TOI/G replaces cap charge as the measure of value in the calculation
  • For goalies, TOI/G has been worked out as Total Minutes Played / Games Dressed For* - i.e. a goalie playing every minute of 75% of the games, zero in the rest, would end up with a TOI/G of 45 minutes (or close to it, once you factor in OT and so on).  [*Actually, "Games Played by Team - Games Missed by Goalie" - I'm not inclined to disentangle any three-goalie systems or minor-league conditioning stints.]
  • This arguably overstates the worth of starting goalies somewhat, but it's simple and you could equally argue that a workhorse goalie is the hardest position to replace, so it's fair for them to have a much higher TOI/G figure
  • Where a player hasn't played all year or where a player fairly clearly has a reduced TOI/G figure due to getting injured in their only game or one of very few games, I've used TOI/G from last season (or further back if necessary)
  • For each player, multiply games missed by TOI/G to get (for a more palatable name) Cumulative Minutes of Injured Player (CMIP)
  • Take the aggregate of CMIP for the team and divide by games played by the team to arrive at AMIP (Average Minutes of Injured Players) - it feels more understandable expressing this metric as an average per game (whereas CHIP is a running total)
The figures...
The table below shows:
  • Total CHIP for each team over the first 10 games of the regular season, as well as the distribution of CHIP by position
  • The player who has contributed most to the team's CHIP figure
  • The number of players with a CHIP contribution of over $250,000 (think of it as being equivalent to a $1m player missing 20 games or a $4m player missing five games)
  • AMIP for each team over the same period (e.g. an AMIP of 40:00 could be seen as the team missing two 20-minute per game players for every game this season) 
10 second analysis...
The Sharks have obviously struggled badly without Marty Havlát, Raffi Torres, Adam Burish and Tomáš Hertl's sense of respect and dignity.  Meanwhile, the Kings, Blackhawks and Blues are demonstrating that they don't have enough grit and toughness to play in the gritty, tough areas needed to be successful.

The next lists are the top 30 individual CHIP and CMIP contributions:

Rick Nash aside, most of the CHIP leaders have been out for all 10 games, which Nash tells me is "good".

Where does it hurt?
This is another update of the crude injury-by-location analysis. Again, I’ve just used the descriptions found in the player profiles on tsn.ca, so the figures will encompass all the inaccuracies and vagueness within them. It should give a broad indication, if nothing else, though.

 
Clearly very early days (and no attempt made to adjust for injuries pre-dating the season starting), but the crude rate of injuries (instances / total games played) stands at 0.91, compared to 0.80 per game last year (0.78 in 2011/12 and 0.76 in 2010/11).

Notes/Disclaimers
  • Figures exclude a few minor-leaguers / marginal NHLers (usually an arbitrary judgement on my part) who had been on the NHL club’s IR since pre-season. Generally, if a minor-leaguer gets called up and then injured in an NHL game, his games missed will then count towards the CHIP though.  Minor-league conditioning stints immediately after/during a period on IR might be included in the man-games lost figures (but can't guarantee I get it right every time)
  • For the avoidance of doubt, suspensions and absences due to "personal reasons" are not included in the figures.  However, as per previous seasons, any "retired" player still under contract (Savard, Pronger, Ohlund) is still included.
  • There are undoubtedly a few inaccuracies and inconsistencies in there - I do the best I can with the information out there. Corrections might well be picked up in subsequent updates
  • The cap figure obviously doesn't really correlate very well to the "worth" of a player in some cases, e.g. where players are seeing out an old (underpaid or rookie) contract or where players are horrendously overpaid and/or were signed by Paul Holmgren
  • Also, for any player traded where cap hit is retained by his old team, the cap hit used will only reflect that for his current team.
  • I've once again stuck a full team-by-team listing of games missed and CHIP/CMIP numbers by each player on the web HERE
  • Injury/games/TOI info courtesy of tsn.ca and nhl.com - man-games lost info more than likely does not exactly match up with the "official" figures released by individual teams
  • Cap info courtesy of capgeek.com

          The Pain Game 2013 - end of season wrap        
Injury stats update – end of season awards

This is my final update for the 2012/13 regular season, looking at which teams have been hit hardest by injuries by placing a value on the games missed by players due to injury/illness.  (If you want to see the Q3 analysis, here is the Q3 analysis.)

[At some point in the summer, I'll probably rework the numbers using GVT again and try to do another summary analysis of the last five years' of CHIP figures, maybe with attractive charts of CHIP vs standings points and other exciting stuff like that. Stay tuned.]

The concept again - multiply each game missed by a player by his (annual) 2012/13 cap charge, then take the aggregate of these figures for each team and divide by 82. This indicator of value lost to a team by injury/illness is called CHIP (Cap Hit of Injured Players).

Note that for reasons of comparability, players' cap hits being published as full-season equivalents and above all, laziness, I have made no attempt to adjust the calculations to account for the 48-game season.

Alternatively...
Again, for a different indicator of player "value", I've also illustrated a similar metric based on TOI/G alongside the CHIP numbers.  Clearly, neither cap charge nor TOI/G are perfect measures of player value, since each have a number of limitations and inconsistencies, but they provide a decent comparison and the results do vary somewhat.

A quick summary of the alternative metric:
  • TOI/G replaces cap charge as the measure of value in the calculation
  • For goalies, TOI/G has been worked out as Total Minutes Played / Games Dressed For* - i.e. a goalie playing every minute of 75% of the games, zero in the rest, would end up with a TOI/G of 45 minutes (or close to it, once you factor in OT and so on).  [*Actually, "Games Played by Team - Games Missed by Goalie" - I'm not inclined to disentangle any three-goalie systems or minor-league conditioning stints.]
  • This arguably overstates the worth of starting goalies somewhat, but it's simple and you could equally argue that a workhorse goalie is the hardest position to replace, so it's fair for them to have a much higher TOI/G figure
  • Where a player hasn't played all year or where a player fairly clearly has a reduced TOI/G figure due to getting injured in their only game or one of very few games, I've used TOI/G from last season (or further back if necessary)
  • For each player, multiply games missed by TOI/G to get (for a more palatable name) Cumulative Minutes of Injured Player (CMIP)
  • Take the aggregate of CMIP for the team and divide by games played by the team to arrive at AMIP (Average Minutes of Injured Players) - it feels more understandable expressing this metric as an average per game (whereas CHIP is a running total)
The figures...
The table below (playoff teams highlighted in yellow) shows:
  • Total CHIP for each team over the 48-game regular season, as well as the distribution of CHIP by position
  • The player who has contributed most to the team's CHIP figure
  • The number of players with a CHIP contribution of over $250,000 (think of it as being equivalent to a $1m player missing 20 games or a $4m player missing five games)
  • AMIP for each team over the same period (e.g. an AMIP of 40:00 could be seen as the team missing two 20-minute per game players for every game this season) 

The following is a ranking of teams by CHIP over Games 37-48 only, to further illustrate some of the biggest movers since last time:


10 second analysis...
A tight race between the three teams that achieved some separation from the rest was eventually won by the Flyers by a thin margin, but with the usual caveat that $3.2m of their total comes from the known/permanent absences of Chris Pronger and Marc-Andre Bourdon.  Ed Snider is presumably briefing Bob Clarke on exactly which unfortunate injuries to administer to protect Danny Briere and Ilya Bryzgalov against prior to next season.

One or two fairly stark differences between the man-games lost figure and the corresponding CHIP ranking are evident, e.g. Detroit, Pittsburgh, Calgary, Montreal.

The Pacific Division seemingly very healthy, with no team higher than the Kings' 16th place, despite Raffi Torres staying within the division at the trade deadline.

Winners of positional crowns:
Goaltenders: Hurricanes [Cam Ward with a destroyed MCL still the least-worst goalie in the Southeast]
Defensemen: Flyers [nope, I didn't realise they ever had any defensemen either]
Forwards: Panthers [bad idea for Dale Tallon to arrange delivery of the players' medication]

And the paper hats:
Goaltenders: 7-way tie [mostly successful workhorse starters, plus Ryan Miller]
Defensemen: Blues [presumably the only reason they could find to trade away Wade Redden]
Forwards: Kings [34 fewer chances for Carter and Richards to be "ill" and miss a game]

The next lists are the top 30 individual CHIP and CMIP contributions:

So the Senators had some key players out for a while, huh?

Players under contract who missed all 48 games:
Mitchell (Los Angeles)
Sutton (Edmonton)
Savard (Boston) [second full season out]
Sauer (NY Rangers)
Pronger (Philadelphia)
Bourdon (Philadelphia)
Bergenheim (Florida)
Ohlund (Tampa Bay) [second full season out]

Where does it hurt?
This is another update of the crude injury-by-location analysis. Again, I’ve just used the descriptions found in the player profiles on tsn.ca, so the figures will encompass all the inaccuracies and vagueness within them. It should give a broad indication, if nothing else, though.

 
No massive shift in the distribution of injuries compared to last year, unless you count Colorado's new-found liberal use of "torso" injuries as a descriptor.

Man-games lost in Q4 total 1,218 compared to 1,178 in Q3, 1,126 in Q2 and 816 in Q1.  The crude rate of injuries (instances / total games played) ended up at 0.80 per game compared to 0.78 per game last year (0.76 per game in 2010/11).  Man-games lost per game this year ended up at 6.0, down from 6.8 last year (6.6 in 2010/11).

Finally, a look at the Evasiveness Index.  This is basically the proportion of injury instances for each team that have been described as either "Undisclosed" or the helpfully pointless "Upper/Lower Body" in the same TSN profiles.  I have made no judgement about whether the many instances of "Illness" (i.e. concussion), "Flu" (i.e. concussion), "None of your business" (i.e. Rick Nash concussion) or "Helmet heat" (i.e. Maple Leaf concussion) should also be included.
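For the curious, the calculation behind the index is trivial; here's a toy version with invented teams and injury labels (not the real figures):

    # Toy illustration of the Evasiveness Index: the share of a team's injury
    # instances labelled "Undisclosed" or "Upper/Lower Body".  Counts are invented.
    EVASIVE_LABELS = {"undisclosed", "upper body", "lower body"}

    injury_log = {
        "Team X": ["knee", "undisclosed", "upper body", "concussion", "lower body"],
        "Team Y": ["ankle", "shoulder", "flu", "groin"],
    }

    for team, labels in injury_log.items():
        evasive = sum(1 for label in labels if label.lower() in EVASIVE_LABELS)
        print(f"{team}: Evasiveness Index = {evasive / len(labels):.0%}")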


A lot of the usual suspects at the top end of the list, while Edmonton repeat as disclosure champions (which Kevin Lowe will angrily remind everybody of at a press conference to unveil Johnny Muckler-Tavish Jr as the Oilers new GM in 20 years' time), controversially tying with the Red Wings who mysteriously un-undisclosed an injury since my last update.


Notes/Disclaimers
  • Figures exclude a few minor-leaguers / marginal NHLers (usually an arbitrary judgement on my part - you tell me whether Matt Taormina is/was still an NHLer at the start of the season...) who had been on the NHL club’s IR since "pre-season". Generally, if a minor-leaguer gets called up and then injured in an NHL game, his games missed will then count towards the CHIP though.  Minor-league conditioning stints immediately after/during a period on IR might be included in the man-games lost figures (but can't guarantee I get it right every time)
  • There are undoubtedly a few inaccuracies and inconsistencies in there - I do the best I can with the information out there. Some corrections are picked up month-to-month too
  • The cap figure obviously doesn't really correlate very well to the "worth" of a player in some cases, e.g. where rookie bonuses are included this year, where players are seeing out an old (underpaid or rookie) contract or where players are horrendously overpaid
  • Also, for any player who was acquired on re-entry waivers (RIP), the cap hit will only reflect that for their current team, i.e. 50% of the player’s full cap hit (shared between his current and old teams). I have taken a similar approach to players traded with cap hit retained by their old team - of these cases, only Jason Pominville has been injured so far
  • I've once again stuck a full team-by-team listing of games missed and CHIP/CMIP numbers by each player on the web HERE
  • Injury/games/TOI info courtesy of tsn.ca and nhl.com - man-games lost info more than likely does not exactly match up with the "official" figures released by individual teams
  • Cap info courtesy of capgeek.com

          The Pain Game 2013 - Third Quarter Report        
Injury stats update – Q3 analysis

This is my third look of the 2012/13 regular season at which teams have been hit hardest by injuries, trying to place a value on the games missed by players due to injury/illness.  (If you want to see the Q2 analysis, here it is.)

The concept again - multiply each game missed by a player by his (annual) 2012/13 cap charge, then take the aggregate of these figures for each team and divide by 82. This indicator of value lost to a team by injury/illness is called CHIP (Cap Hit of Injured Players).
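For anyone who prefers code to prose, here's a minimal sketch of that calculation. None of the players or dollar figures below are real:

    # Minimal sketch of the CHIP calculation: each game missed is multiplied by
    # the player's annual cap charge, the results are summed for the team, and
    # the total is divided by 82.  Players and figures are purely illustrative.

    injured = [
        # (player, annual cap hit in $, games missed)
        ("Top-six forward", 5_000_000, 10),
        ("Second-pair defenseman", 3_250_000, 6),
        ("Backup goalie", 1_000_000, 15),
    ]

    team_chip = sum(cap_hit * games_missed for _, cap_hit, games_missed in injured) / 82

    print(f"Team CHIP = ${team_chip:,.0f}")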

Note that for reasons of comparability, players' cap hits being published as full-season equivalents and above all, laziness, I have made no attempt to adjust the calculations to account for the 48-game season.

Rather than measuring the figures at month ends, as I have done in previous seasons, I'll be cutting the shortened season into 12-game chunks.  So this analysis covers every team up to the 36-game point.

Alternatively...
Again, for a different indicator of player "value", I've also illustrated a similar metric based on TOI/G alongside the CHIP numbers.  Clearly, neither cap charge nor TOI/G is a perfect measure of player value, since each has a number of limitations and inconsistencies, but they provide a decent comparison and the results do vary somewhat.

A quick summary of the alternative metric:
  • TOI/G replaces cap charge as the measure of value in the calculation
  • For goalies, TOI/G has been worked out as Total Minutes Played / Games Dressed For* - i.e. a goalie playing every minute of 75% of the games, zero in the rest, would end up with a TOI/G of 45 minutes (or close to it, once you factor in OT and so on).  [*Actually, "Games Played by Team - Games Missed by Goalie" - I'm not inclined to disentangle any three-goalie systems or minor-league conditioning stints.]
  • This arguably overstates the worth of starting goalies somewhat, but it's simple and you could equally argue that a workhorse goalie is the hardest position to replace, so it's fair for them to have a much higher TOI/G figure
  • Where a player hasn't played all year or where a player fairly clearly has a reduced TOI/G figure due to getting injured in their only game or one of very few games, I've used TOI/G from last season (or further back if necessary)
  • For each player, multiply games missed by TOI/G to get (for a more palatable name) Cumulative Minutes of Injured Player (CMIP)
  • Take the aggregate of CMIP for the team and divide by games played by the team to arrive at AMIP (Average Minutes of Injured Players) - it feels more understandable expressing this metric as an average per game (whereas CHIP is a running total)
The figures...
The table below shows:
  • Total CHIP for each team over the first 36 games of the regular season, as well as the distribution of CHIP by position
  • The player who has contributed most to the team's CHIP figure
  • The number of players with a CHIP contribution of over $250,000 (think of it as being equivalent to a $1m player missing 20 games or a $4m player missing five games)
  • AMIP for each team over the same period (e.g. an AMIP of 40:00 could be seen as the team missing two 20-minute per game players for every game this season) 

The following is a ranking of teams by CHIP over Games 25-36 only, to further illustrate some of the biggest movers since last time:


10 second analysis...
The Senators are still without Spezza, Karlsson, Michalek, Anderson, Cowen, Yashin, Bonk, Daigle, Rhodes, Boschman and Sidorkiewicz, among others, so they are unsurprisingly still heading the pack in CHIP terms, despite having fewer man-games missed than three other teams.

Despite almost keeping pace with the Senators' injuries and winning the fewest games in the whole league, the Panthers impressively still lead the ultra-competitive, high-quality Southeast Division by 23 points.  The rival Hurricanes lead the league in men lost, having (I think) re-re-re-acquired Zach Boychuk.

The Penguins' expensive-looking outlier in this quarter was mostly due to Evgeni Malkin. The Flames practically had an injury-free quarter, aside from the continued absence of low-cost Paul Byron, plus Jay Feaster's frontal lobotomy.

The next lists are the top 30 individual CHIP and CMIP contributions:
Tuomo Ruutu is in the unenviable position of being between Chris Pronger and Ryan Kesler, so past history tells us he might soon be out again with a skate blade laceration.

Where does it hurt?
This is another update of the crude injury-by-location analysis. Again, I’ve just used the descriptions found in the player profiles on tsn.ca, so the figures will encompass all the inaccuracies and vagueness within them. It should give a broad indication, if nothing else, though.

 
Man-games lost in Q3 total 1,178 compared to 1,126 in Q2 and 816 in Q1.  I remain unconvinced that injuries are occurring at a significantly higher rate than previous years - a crude indication, but the rate of injuries (instances / total games played) sits at 0.81 per game compared to 0.78 per game last year (0.76 per game in 2010/11).

Man-games lost per game so far this year are 5.8, down from 6.8 last year and 6.6 in 2010/11.  (All figures clearly distorted somewhat by long-term injuries existing before seasons begin.  And the rapid advances in the league's cutting edge treatment of concussions, of course...)

Finally, a look at the Evasiveness Index.  This is basically the proportion of injury instances for each team that have been described as either "Undisclosed" or the helpfully pointless "Upper/Lower Body" in the same TSN profiles.  I have made no judgement about whether the many instances of "Illness" (i.e. concussion), "Flu" (i.e. concussion) or "None of your business" (i.e. Rick Nash concussion) should also be included.


Four previous possessors of zeros here have clammed up to some degree, leaving the Oilers to fly the flag of transparency alone. Unless Steve Tambellini is just missing a trick here?  Nah...


Notes/Disclaimers
  • Figures exclude a few minor-leaguers / marginal NHLers (usually an arbitrary judgement on my part - you tell me whether Matt Taormina is/was still an NHLer at the start of the season...) who had been on the NHL club’s IR since "pre-season". Generally, if a minor-leaguer gets called up and then injured in an NHL game, his games missed will then count towards the CHIP though.  Minor-league conditioning stints immediately after/during a period on IR might be included in the man-games lost figures (but can't guarantee I get it right every time)
  • There are undoubtedly a few inaccuracies and inconsistencies in there - I do the best I can with the information out there. Some corrections are picked up month-to-month too
  • The cap figure obviously doesn't really correlate very well to the "worth" of a player in some cases, e.g. where rookie bonuses are included this year, where players are seeing out an old (underpaid or rookie) contract or where players are horrendously overpaid
  • Also, for any player who was acquired on re-entry waivers (RIP), the cap hit will only reflect that for their current team, i.e. 50% of the player’s full cap hit (shared between his current and old teams). I guess I'll take a similar approach to a player traded where the cap hit is retained by his old team - Jussi Jokinen and Jason Pominville appear to be the test cases once one of them gets injured.
  • I've once again stuck a full team-by-team listing of games missed and CHIP/CMIP numbers by each player on the web HERE
  • Injury/games/TOI info courtesy of tsn.ca and nhl.com - man-games lost info more than likely does not exactly match up with the "official" figures released by individual teams
  • Cap info courtesy of capgeek.com

          EPA launches new greenhouse gas inventory tools for local and tribal governments        
EPA is pleased to announce that it has launched two free, interactive spreadsheet tools to help local governments and tribes across the United States evaluate their greenhouse gas emissions: the Local Greenhouse Gas Inventory Tool and the Tribal Greenhouse Gas Inventory Tool. Both tools calculate greenhouse gas emissions for many sectors, including residential, commercial, transportation, and waste and […]
          Operations Communications Coordinator        
Exam number: 
#62-615
Exam type: 
Open Competitive (open to the public)
********************Amendment Issued August 8, 2017*********************** see NOTE under Residence Requirements section below.
Salary: 
SALARY VARIES
Opening Date: 
August 4, 2017
Closing Date: 
September 6, 2017
Examination date: 
October 14, 2017
Application fee: 
$20.00
RESIDENCE REQUIREMENTS:  CANDIDATES MUST HAVE BEEN LEGAL RESIDENTS OF ERIE COUNTY FOR AT LEAST ONE MONTH IMMEDIATELY PRECEDING THE DATE OF THE WRITTEN TEST AND MUST BE LEGAL RESIDENTS OF ERIE COUNTY AT THE TIME OF APPOINTMENT.

There are a total of three current vacancies: one current vacancy at Erie 1 BOCES* (see NOTE*); one current vacancy in the Erie County Department of Social Services; and one current vacancy in the Erie County Division of Information and Support Services (DISS).  The annual salary is $35,025 - $45,664 for Erie County Departments; salaries vary among municipalities and districts.  The eligible list resulting from this examination will be used to fill this vacancy and other appropriate vacancies which may occur in municipalities or districts under the jurisdiction of Erie County Civil Service while this list is active.

********************Amendment Issued August 8, 2017***********************
*NOTE: for Erie 1 BOCES vacancy only:* CANDIDATES MUST HAVE BEEN LEGAL RESIDENTS OF ERIE COUNTY OR AN ERIE COUNTY SCHOOL DISTRICT FOR AT LEAST ONE MONTH IMMEDIATELY PRECEDING THE DATE OF THE WRITTEN TEST AND MUST BE LEGAL RESIDENTS OF ERIE COUNTY OR AN ERIE COUNTY SCHOOL DISTRICT AT THE TIME OF APPOINTMENT. Candidates who reside in a school district which, for civil service purposes, is administered by the Erie County Personnel Commissioner, but who are not legal residents of Erie County may apply for examination. However, they will be certified for appointment only in the school district in which they are legal residents.
Examples of Duties: 
An Operations Communications Coordinator coordinates and/or installs, operates, maintains and controls remote site communication equipment, terminal equipment and micro based equipment.

If assigned to Division of Information and Support Services (DISS) or BOCES:
  • Installs and/or assists in the installation of communication equipment, terminal equipment and micro based equipment
  • Acts as liaison between communications users and the data processing department to obtain information necessary for system software modification when communication and terminal equipment is installed
  • Performs trouble shooting procedures on communication lines, communication, terminal and micro based equipment
  • Works with systems personnel establishing User and Communication procedures and manuals
  • Trains users in the operation of remote terminal site equipment
  • Maintains records that are necessary for determining maintenance, operating efficiency and cost
  • Assists in providing off hour coverage via call-in lists, beepers or other methods, as required
  • Provides maintenance in a timely manner in response to user requests

If assigned to Social Services:
  • Acts as a coordinator and liaison with New York State and the department
  • Assists in the planning, coordination and control of installation of terminals, PC's and communication equipment
  • Performs trouble-shooting procedures on terminals, PC's and communication equipment
  • Provides and/or arranges for maintenance in a timely manner
  • Assists in developing and conducting training programs on communication equipment for staff
  • Maintains records that are necessary for maintenance, operating efficiency and cost
  • Acts as a Security Coordinator in creating and maintaining necessary records
  • Controls the ordering, receiving and billing for communication goods and services
  • Performs other related tasks as necessary
Qualifications: 
MINIMUM QUALIFICATIONS:  Candidates must meet one of the following requirements on or before the date of the written test:
  • A. Graduation from a regionally accredited or New York State registered four (4) year college or university with a Bachelor's degree in computer science and one (1) year experience in Communication Processing and Control, which involved responsibility for the installation and operation of Network Controls, Modems, Multiplexers, Front End Processors, Line Monitors, Analog and/or Digital Communication Lines and understanding the functioning of a tele-processing operation; or
  • B. Graduation from a regionally accredited or New York State registered two (2) year college with an Associate's degree in Telecommunication, Computer Science or related field and three (3) years experience in Communication Processing and Control, at least one (1) year of which involved responsibility for the installation and operation of Network Controls, Modems, Multiplexers, Front End Processors, Line Monitors, Analog and/or Digital Communication Lines and understanding the functioning of a tele-processing operation; or
  • C. Graduation from high school and five (5) years experience in Communications Processing and Control, at least one (1) year of which involved the responsibility for the installation and operation of Network Controls, Modems, Multiplexers, Front End Processors, Line Monitors, Analog and/or Digital Communication Lines and understanding the functioning of a tele-processing operation; or
  • D. An equivalent combination of training and experience as defined by the limits of (A), (B) and (C).
NOTES:  1. Verifiable part-time and/or volunteer experience will be pro-rated toward meeting the experience requirements. 2. Your degree and/or college credit must have been awarded by a regionally accredited college or university or one recognized by the New York State Education Department as following acceptable educational practices. A grade of "D" or better is necessary for a course to be credited as successfully completed. If your degree and/or college credit was awarded by an educational institution outside of the United States and its territories, you must provide independent verification of equivalency. You can write to this Department for a list of acceptable companies providing this service; you must pay the required evaluation fee. Notice to Candidates: Transcripts will now be accepted by the Department of Personnel ONLY at time of application. All subsequent transcripts must be submitted at time of interview.
The New York State Department of Civil Service has not prepared a test guide for this examination. However, candidates may find information in the publication "How to take a written test" helpful in preparing for this test. This publication is available online at: www.cs.ny.gov/testing/localtestguides.cfm
Subjects of Examination: 
Subjects of examination: A written test designed to evaluate knowledge, skills and/or abilities in the following areas:
  • 1. Office record keeping - These questions test your ability to perform common office record keeping tasks. The test consists of two or more "sets" of questions, each set concerning a different problem. Typical record keeping problems might involve the organization or collation of data from several sources; scheduling; maintaining a record system using running balances; or completion of a table summarizing data using totals, subtotals, averages and percents. You should bring with you a hand-held battery- or solar-powered calculator for use on this test. You will not be permitted to use the calculator function of your cell phone.
  • 2. Fundamentals of PC systems - These questions test for knowledge of basic concepts and terminology related to PC's. They cover such topics as PC and peripheral equipment; storage media; types of software used with PC's; and other associated terms and concepts.
  • 3. Use and operation of PC's and related peripheral equipment - These questions are designed to test for technical knowledge and concepts relevant to the operation of a PC and associated peripheral equipment for word processing, spreadsheet analysis, data base management, data communications and other applications. The questions asked are not specific to any vendor or any model of PC.
  • 4. Principles of providing user support - These questions test for knowledge and skill in working in a user support situation. They cover such subjects as how to communicate effectively with users requesting help; how to deal with different types of situations; troubleshooting techniques; and how to gather, organize and make available technical information needed to provide support.
  • 5. Training users of computers - These questions test for knowledge of techniques for using computers and approaches to training others to use them. They cover such subjects as use of computer hardware, software and applications; preparing and evaluating instruction materials; determining the level of trainees' knowledge and the use of computers to provide instruction and feedback. The questions on training depend upon good judgment and practical experience rather than knowledge of abstract principles.

NOTICE TO CANDIDATES:  Unless otherwise noted, candidates are permitted to use quiet, hand held, solar or battery powered calculators.  Devices with typewriter keyboards, "Spell Checkers", "Personal Digital Assistants", "Address Books", "Language Translators", "Dictionaries", or any similar devices are prohibited.  You may not bring books or other reference materials.

          Hero: Deepak Bista        

I’ve had the incredible opportunity to work alongside Deepak during my time here working in the Data department. He is the hospital’s Data and Technology officer. As such, he is in charge of all data entry and reporting in the hospital, and is also responsible for any task relating to technology that needs to be completed. 

Deepak is an incredibly hard worker. He works in a stifling office that has been affectionately nicknamed “The Oven” by other staff for its lack of ventilation and high temperature when it is warm out. Deepak spends hours at a time entering data into databases and spreadsheets, a task that could wear on anyone, but he always has a smile when he does it. 

What is truly incredible about Deepak is his knowledge and drive to continue improving the hospital’s data systems. He is like an artist when working with his spreadsheets, crafting them in a way that will ensure the most utility of the data and the highest accuracy. When data isn’t reported to him on time, he will go out and track it down, even if it takes talking to every single employee in the hospital. Deepak also has a genuine love for the work that he does. He enjoys seeing what the data holds, and is rarely without a smile when seeing it all come together.

Deepak also has an incredible degree of initiative. He is excited to continue working on the data communication plan, and offered many suggestions of his own in its design. After a long meeting discussing ways to improve the pharmacy and store system at the hospital, by the next morning Deepak had constructed an Access database from scratch that was incredibly functional and has already replaced the hospital’s previous stock system. He can also often be found in his office on Saturdays (the only day of the week that people have off) chugging along in order to get our data reported on time.

Deepak is a great co-worker, always willing to drop what he is doing to lend a hand. You can hear his distinctive laugh even before you see him, and you know he is either chatting enthusiastically about his work, or about the weather, or about the football game after work.

I’m honored that I’ve been able to work alongside him the past couple months, and will be sad to say goodbye.

Deepak is a hero.

          With Microsoft Dynamics NAV, Heembouw always has complete and up-to-date insight into its construction projects        
Originally, administration at Heembouw was handled mostly with strictly separated systems. David Naaraat, head of IT at Heembouw: "There was little to no integration; the software packages in use each did their own trick. Data was difficult for employees to share, and comparing data was also difficult, while for a project organization like Heembouw that is precisely a must. The result was that all sorts of loose spreadsheets were in circulation for the processes on the production side, but managing them was very labour-intensive." Heembouw evaluated a number of ERP solutions from different vendors on the market, including Microsoft's Dynamics NAV. David Naaraat: "Microsoft's integrated platform, the assurance of continuity, the low barrier to entry of Dynamics NAV and the high degree of familiarity made us choose Microsoft Dynamics NAV."
          With Microsoft Enterprise Project Management, the Oad Groep has continuous, complete insight into all ongoing projects        
Oad's new business model brought a major change of processes on the business side, and IT had to be adapted accordingly. The Oad Groep therefore started a programme five years ago involving 20 million euros and consisting of nearly thirty projects. "If you want to manage all those projects properly, Excel sheets no longer suffice. Not only do you lose the overall picture, it is also far too costly to keep all the information in spreadsheets up to date," says Robert Dorenbusch. He is ICT manager within the Oad Groep. "Professional planning is a first requirement, certainly when an average of 50 employees are working on 30 projects simultaneously." As a solution, the Oad Groep chose Microsoft Enterprise Project Management (EPM), consisting of Microsoft Project Server, Project Web Access and Project Professional. Products from other vendors were considered only briefly. Robert Dorenbusch: "The Oad Groep already uses quite a lot of Microsoft products and EPM integrates well with them, such as the portals of Microsoft SharePoint Server, a perfect place to store all the documentation belonging to the various projects." DBS Project from Amersfoort is Oad's Microsoft Certified Partner. Over the past five years, DBS Project has delivered consultancy services for Project Server at some sixty companies and organizations. From the moment EPM was rolled out, the Oad Groep has been benefiting from the many advantages the package offers. Besides better insight, better invoice control and timely steering, Robert Dorenbusch mentions a fourth advantage, namely accessibility: "We work with a Romanian company that develops applications for us. For them, EPM is easily accessible via a web interface, as it is for other external parties we work with. Through the integration with Active Directory they are authorized and their work is also registered in EPM. That makes managing the projects a lot simpler for our project staff and managers." That project documentation is stored in an orderly way is, according to Robert Dorenbusch, a plus: "As soon as you start a new project in EPM, a SharePoint project environment is created where all documents belonging to that project are automatically stored." When asked for a rough estimate of the efficiency gain, Robert Dorenbusch expects it has increased by five percent. "On a total of 20 million euros, that is a great deal."
          More Spring reading        

Hi folks, here's a nice, juicy reading list for that rainy Saturday afternoon. Well... it has stopped raining here but that should not stop you from reading!

Java

Slightly more hard core Java

Java in the future

A little bit of non-Java

Kubernetes

Systems, data stores and more

Time series

Some fun stuff

Until next time! Ashwin.

                  

Entrants List as of 3pm(PST) 2/25/11


The race is now full! I will be taking names for the waiting list until the weekend before the race, and I'll notify the runners on the waitlist on Monday, March 7th whether or not they're in. Or if you missed out on this one, it's not too late (yet!) to sign up for the next Rainshadow Running race!



          Comment on Live Updated Cryptocurrency Investment Tracking Spreadsheet by Zack Covell        
Hello everyone, after searching online for hours last night for a LIVE Cryptocurrency Spreadsheet and finding the one made by Pablo Yabo above in March 2017, I realized that his spreadsheet wasn't at all what I am looking for. *So I figured I'd solve the problem for the Steemit Community myself.* I'm now building out a LIVE @CoinMarketCap Spreadsheet that we can use for daily snapshots of all 100 Cryptocurrencies. **The spreadsheet automatically refreshes every 5 minutes!** When I'm done I'll invite each of you so you can use it how you wish, or we can track certain digi-currencies together.

I'm using the =CRYPTOFINANCE call within the document; I'll show you how to build your own with an upcoming Steemit post in a few days.
=CRYPTOFINANCE("XXXYYY", "marketcap")
=CRYPTOFINANCE("XXXYYY")
=CRYPTOFINANCE("XXXYYY", "total_supply")
=CRYPTOFINANCE("XXXYYY", "volume")
=CRYPTOFINANCE("XXX", "change", "1h")
=CRYPTOFINANCE("XXX", "change", "24h")
=CRYPTOFINANCE("XXX", "change", "7d")
XXX is the coin, e.g. BTC; YYY is the fiat money, e.g. USD

Also see: How to get crypto-currencies rates and more in Google Sheets https://jbuty.com/how-to-get-crypto-currencies-rates-and-more-in-google-sheet-1a57e571bc14

What other ideas can we come up with to make this Spreadsheet more AWESOME???

***NOTE: the #ERROR in certain cells is because the Cryptocurrency has to be only 3 characters, but some are more; for example, DASH has 4 letters, so certain cells cannot go live. Here's a sneak peek of the Live CoinMarketCap Spreadsheet: https://s3-us-west-1.amazonaws.com/cryptocurrency-steemit/LIVE+CoinMarketCap+Spreadsheet.jpg
          Comment on Live Updated Cryptocurrency Investment Tracking Spreadsheet by DenCowboy        
I switched from api provider: example https://api.cryptowat.ch/markets/coinbase/btceur/price don't know if it's useful for you
          Comment on Live Updated Cryptocurrency Investment Tracking Spreadsheet by DenCowboy        
Thanks for the info. Hope it will be fixed soon
          Comment on Live Updated Cryptocurrency Investment Tracking Spreadsheet by Luke Stokes        
Yeah, I emailed them again and it seems they had to put in more rate limiting. I hope they go with some kind of token based API approach instead of the IP based banning.
          Comment on Live Updated Cryptocurrency Investment Tracking Spreadsheet by DenCowboy        
I have an issue with #VALUE! and error: An array value could not be found. It seems to be random. Sometimes it pops up and sometimes it's gone. Sometimes it's for BTC, sometimes for Ethereum,...
          part number / SKU retrieval by crcentre        
I have 4,055 existing SKUs that require inputting missing information into a spreadsheet that is available through an existing website. The spreadsheet is partially complete (approx. 50%). Not all information will be available on this particular site so please leave blank if information not available... (Budget: $30 - $250 AUD, Jobs: Data Entry, Data Extraction, Excel, Internet Research)
          Fill in a Spreadsheet with Data by DEANLAHAM        
Collect information on the internet and copy-paste it into Excel (Budget: $30 - $250 CAD, Jobs: Data Entry, Excel, Internet Research)
          20 T-shirt Designs - Phase 2.0 by priceless2u        
Hello, I am in need of 100 t-shirt designs in the next month. Your task will be pretty basic: I will send you a spreadsheet with 100 ideas for t-shirts I want to make, and you make them yourself in your own unique style... (Budget: $2 - $8 USD, Jobs: Graphic Design, Photoshop, Photoshop Design, T-Shirts)
          Do some data entry by anaisgi        
I have some work in an Excel spreadsheet. We have a database with about 150 records with different information (numerical and text). The main task is to perform some statistical analysis & correlations between the data... (Budget: $30 - $250 USD, Jobs: Excel)
          Union County Employees / Retirees        
New Jersey has updated their pension data which includes listings of all retirees with the amount of their annual pensions as well as those currently employed with their salaries. You can view  spreadsheets listing all 16,330 retirees and 20,878 current employees related to Union County by clicking the link union county employees/retirees above. Here are […]
          Shots Taken per Goal        
In a battle along the half-boards a winger kicks the puck back to the blue-line. The defenseman throws it across the ice to his partner, who settles the puck down, takes a stride towards the net, raises his stick and sends a shot .... three feet wide and a foot too high, right off the windshield.

Missed shots tend to drive me crazy.  I've been known to fire off a quick frustrated tweet when someone (usually named Kris Letang) can't hit the net. And when I'm hard at work being reminded of how poor a hockey player I am, there is nothing more likely to throw me into a Bruce Boudreau-esque tirade than not being able to put the rubber on net.

This isn't something the NHL takes into consideration with its shooting percentage numbers. They simply divide a player's Goals by Shots on Goal. Something Alexander Ovechkin, with a league-leading 66 Missed Shots, might be alright with, but something surely Daniel Briere, who has a mere 27 Missed Shots, would feel is misleading.

And so here I've thrown together the On Goal % and Shots Taken per Goal statistics.

I have added Shots on Goal and Missed Shots to create Total Shots Taken. Based on that number, I divide a player's Shots on Goal by Total Shots Taken for the On Goal %. Then I divide Total Shots Taken by Goals to determine how many shots, on average, a player takes for a goal.

For example, Steven Stamkos scores a goal for every 6.6 shots taken. 74.4% of those shots taken are on goal.
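If you want to reproduce the numbers yourself, the arithmetic is as simple as it sounds; here's a quick sketch with an invented stat line (not any player's actual totals):

    # Sketch of the two derived stats: On Goal % and Shots Taken per Goal.
    # The stat line below is invented, not a real player's numbers.
    goals, shots_on_goal, missed_shots = 20, 98, 34

    total_shots_taken = shots_on_goal + missed_shots
    on_goal_pct = shots_on_goal / total_shots_taken
    shots_taken_per_goal = total_shots_taken / goals

    print(f"On Goal %: {on_goal_pct:.1%}")
    print(f"Shots Taken per Goal: {shots_taken_per_goal:.1f}")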


My Thoughts:

- It comes as no surprise that Claude Giroux tops this list. As a Penguins fan I've seen more than enough of Giroux as he develops into an all-star caliber player.

- Crosby and Stamkos. After cooling off for a bit, Stammer is back at it. Just maybe, with them nipping at each other's heels, they can push each other further, like Magic and Bird. 50 in 50 looks out of reach, but someone, or both, might score 70 goals for the first time in 14 years.

- Even without accounting for the bevy of frozen rubber he's thrown wide of the net, Ovechkin's shooting percentage has taken a big hit so far this season. Taking all those missed shots into account, it isn't pretty for Ovechkin. You've gotta scroll down that chart for a while before you find his name.

- Same goes for Evgeni Malkin.

- As an Eastern Conference fan, I don't get to see nearly enough games from teams out west. This means I'm usually cautious of pontificating on guys that I haven't seen a ton of. That said, let's agree that Chris Stewart needs to take more shots. He is right behind Crosby and Stamkos at 6.9 Shots Taken per Goal, and he's put an insane 86.6% of his shots on net.

Complete stats available in a Google Spreadsheet
          Why you should consider a Windows Phone for your next phone - part 2        
In part 1 of this article I outlined a few of the reasons why I am really starting to enjoy my Windows phone.  In this followup I'll continue that and describe a few more of those reasons.  I'll reiterate here that this is not a review of Windows Phone 8 nor is it a treatise on why it's the best phone OS.  I happen to think that each of the 3 major phone systems are great and have their target audience.  I'm only intending to outline what makes me smile about Windows Phone.


Office and Sky Drive


The next area I'd like to highlight is Office.  Like it or not, the world runs on Microsoft Office.  My company slings around Excel and Word documents.  My daughter complained the other day that she didn't have Office on her computer and that meant she couldn't interact properly with her college professors.  Office runs business, plain and simple.  And Windows Phone has Office built right in.

Samsung and the other Android phone makers often do include office suites that do a remarkably good job with Office compatibility; however, all it takes is one bad experience with formatting, or losing a page or two in your PowerPoint, for you to realize that "almost 100% compatible" can be very frustrating.

Combine Office with the seamless SkyDrive integration and you get a very nice mobile workplace.  I'm not spending much time talking about Office as it really "just works" and is one of the best spreadsheet and document editor experiences you'll find on a mobile phone.  Where it really shines is when you mix in SkyDrive.

When you open up Office on the phone you are met with a Recents list that spans documents on your phone or in the cloud.  No distinction is made, and opening a document from the cloud will check that you have the latest version before opening and automatically save it back to the cloud when done.  You don't ever have to worry about manually syncing a folder.  Trying to set up something similar on my Galaxy is harder.  You can install an office suite (like KingSoft) and even open documents right from sites like Dropbox; however, saving the files didn't appear to push them back to Dropbox automatically.  Yes, I could set up some auto-syncing of Dropbox with a folder, but none of this is automatic and it can be challenging for a new user (like your mom!) to set up.  And, even once you get it running, you still have something that is "mostly compatible" with Office.

Being 100% compatible with Office and seamless integration with the cloud makes Office and SkyDrive a killer story for Windows Phone 8.  Windows 8.1 is making the story even sweeter with even better integration of SkyDrive with Windows on tablets, laptops, and desktops.  Installing Office on these other computers now means that you can edit your Office files wherever you are without any concern about breaking compatibility or that you are editing an older version.  Peace of mind is worth a lot!


Photo Integration, Automatic Uploads, and Live Tile


Windows Phone does a great job of pulling photos together from several different sources into a single location.  Everyone manages their photos in a different way.  On my Lumia you can go into the Photos app, choose Albums, and see all the photo albums I have on my phone, on SkyDrive, and on Facebook, all in one location. No need to open each of these apps separately.  And, while it hasn't been utilized a great deal, I think other apps can take advantage of this as well.

The next call out here is automatic, full resolution, upload of your photos and videos to SkyDrive.  It's a ton of fun to take a bunch of pics, come home and grab some dinner (while your phone uploads all the stuff you shot as soon as it hits your WiFi), and then grab your Surface RT or tablet and swish through all the shots you got right there on SkyDrive.

The last thing I want to mention here is how the Photos app updates the Live Tile.  It's always fun to unlock your phone and see a fresh, rotating set of pictures from your camera roll right there on the Live Tile.


Facebook Integration


There is a lot to say about Facebook integration in Windows Phone so I will just highlight a few of the areas that I particularly enjoy.

In the Me tile I can post a status update to all of my social networks at one time.  I can update my status on Live Messenger, Facebook, Twitter, and LinkedIn all at once and in one place.  This is a great time saver.  Yes, I know that Android has apps that can do this, but anything integrated and built in is better in my book.  Also in the Me tile I can check in on Facebook and see all my Facebook and Twitter notifications and interactions.  Very handy!

I've already highlighted that I can see all my Facebook photo albums just by opening the Photos app and looking in my albums.  However, as you can see in the photo above, the Photos app has a What's New section that shows you all the photos that your Facebook friends are posting.  Want to see that cool photo  your sister posted this morning?  No need to open Facebook. Just open Photos and hit What's New and there it is!

One pet peeve I have with the Facebook app on Android is that I keep having to search for the family members I want to tag.  Not so on Windows Phone.  Right from the picture I can choose share to Facebook, then click the add tag button to see a list of most recently used tags.  No need to search.  Love this feature!  And if I want to do something more complicated I can always just crack open the Facebook app.

The last thing I want to mention about Facebook integration is one of my favorites.  I originally didn't think I would like it, but boy have I changed my mind.  Windows Phone allows you to specify that an app will control what the lock screen looks like, and Facebook supports this.  When you install Facebook and run it the first time you are given the choice to have Facebook manage the lock screen and how it should look.  Now, every time I wake my phone I'm greeted with a new photo right out of my Facebook photo albums.  I really can't tell you how many times I've chuckled or smiled at a photo that was on my lock screen.  It rotates them many times throughout the day too, so it's always fresh.

Contact Handling


Contact handling on Windows Phone is really quite nice.  It has all the same features you would expect such as grouping contacts from Facebook, Google, and others into a single directory, the ability to set custom ringtones for a contact, and edit details like birthdays, spouses, etc.  However there are a couple things that really make it stand out.

The first is contact grouping.  You can create groups of contacts with a given name.  Once you have the group you can then go into the group and email them as a group, SMS to them as a group, or see what they have been posting to their Facebook or Twitter accounts as a group. You can view their shared photos as a group.  You get all the same functionality as when you are looking at all your contacts but it is filtered down to just that group.  This can be very handy!

And my favorite is contact profile pics from social networks.  Yes, I know that some Android phone makers have done this for Facebook, but I have never seen anyone do it with Facebook, Twitter, and LinkedIn and have that information flow everywhere in the system, including SMS and email.  It's very cool to get a phone call from your wife, see her picture full screen on your phone, and realize that she has changed her profile picture.  Yes, I can manually set her contact photo to anything I like, but I enjoy letting my contacts set their own profile pictures.

And that wraps up part 2.  In the next (and likely last) installment I'll finish up going over my favorite features with two of the best, battery life and voice commands.  Talk to you soon!

          February Review        
February continued to work out very well for me. I'm not entirely sure anymore if it was just the super soft site I was/am playing on or something I changed in my game while simultaneously switching sites. I've put in some hours at Stars and FTP again when my fishy Euro site was down or low on traffic without any difference in winrate.

Granted, I've played all of 54k hands this year due to 6-tabling the Euro site for most of it, so it's not much of a sample for winrate, but I may just start putting more hours and hands in for FTP Rush, so we'll see how it pans out. I'm definitely liking how quickly I can jump back up the FTP rakeback ladder. With a few inexpensive mods, I've found I really don't mind the software at all.

Back to results, I find myself being much more appropriately aggressive after putting in some fundamental work on the math behind my actions and my opponents' ranges. For the past 2 months I've felt like I've been "in the zone" when it comes to making good decisions and hand reading like a fiend.

I've had moments of clarity like that in the past, but they'd always been fleeting, eventually leading me back to my breakeven play style. Two weeks here, three days there. I don't know if this will continue, but every few days I find myself thinking back to that basic off the table study I did at the beginning of January, sometimes running scenarios through my spreadsheets, and it continues to work for me. I think I'm a much tougher opponent to play against for it.

What I'm definitely most proud of so far this year is my play from the blinds. It's easy to win on the button, but I think the loss rate from the blinds really shows who feels on top of their hand reading game. Having loss rates of -2 in the SB and -26 in the BB is a huge factor in my overall winrate so far. I have historically run at about -20 and -45, respectively.

It's taken me a very long time and I've done similar work before, but I'm finally fully accepting that the math does not lie, no matter how counter-intuitive it may be at times. For me, the battle between ego and theoretical strategy is a big one at the table. I need to keep that in check. All of your study amounts to nothing if you're going to be making spite calls and spewy shoves.

March has started where February left off. I'm going to have a conservative goal of 40k hands. The game plan for site selection is going to be FTP/Stars and primarily Rush/Zoom due to the increased traffic while my winrate sustains.


Jan.  34 buyins |  16k hands   |    21 bb/100*
Feb.  44 buyins |  32k hands   |   14 bb/100
* modified from last review, forgot to include Stars results.
          Guess Who? Beginner Strategy Analysis        
My daughter has recently taken a liking to hauling out all of my old board games. The favourite so far is Guess Who? Since I am consistently demolished and have victory snatched away from me, with one piece remaining, by 1-in-8 Hail Mary guesses far more often than 1 time in 8, I decided to look at the numbers to try to gain an edge.



I'm choosing to ignore all of the potential meta-game, question combining and deceptive game play strategies available. She is not yet 5 years old after all. But I thought there must be some statistical edge to gain through the simple, common questions.

The Setup

  • 24 Characters.
  • 9 Common traits: Bald, Bearded, Eye Colour, Gender, Glasses, Hair Colour, Hats, Moustachioed, and Nose Size.
  • Traits are split relatively evenly at 19:5 - 5 with glasses, 19 without glasses; 5 females, 19 males; 5 with large noses, 19 with small noses.
  • There are two exceptions: Only 4 bearded characters and 4 with brown hair.
  • Traits are split unevenly between characters, ranging from 1 to 4 uncommon traits each.
  • Each player chooses one character at random out of a deck of cards.
  • *Note I am also ignoring the fact that, according to 4 year olds "Oh, I/you already had this one! That's silly! [draws a new card]." Also, some hair is distinctly yellow, not blonde.
  • The game proceeds by a process of elimination.


The result of inputting all of this data into a spreadsheet cross-referenced by character quickly shows that there is a sum of 63 uncommon traits among the group: 5 each among 7 of the non-hair groups (35), 4 in one (4), 4 hair colours of 5 each (20), and one hair colour of 4 (4). Thus we can already determine that the first guess (assuming we always get a 'no') will yield 5 eliminations for every question except those about beards and brown hair, which will result in 4 eliminations.

Keep in mind that your own character's traits will skew the results, essentially acting as blockers. Therefore we should avoid guessing character traits that our character possesses. For example, if our character has black hair and we ask if our opponent has black hair, we are only eliminating 4 characters in practice (again, ignoring the meta-game of optimal elimination vs bluffing strategies employed by experienced players).
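If you'd rather not build the spreadsheet yourself, a toy version of the same first-question count looks like this. The character/trait table is a small invented subset of the board (not the full 24-character roster), and George is just an arbitrary pick for our own card:

    # Toy version of the first-question analysis: count how many characters a
    # 'no' answer eliminates for each trait, excluding anything our own
    # character blocks.  The table is a small invented subset of the board.

    characters = {
        "Alex": {"moustache", "black hair"},
        "Alfred": {"moustache", "blue eyes"},
        "Anita": {"female", "blue eyes"},
        "Bernard": {"hat", "big nose"},
        "Bill": {"bald", "beard"},
        "Claire": {"female", "glasses", "hat"},
        "George": {"white hair", "hat"},
        "Maria": {"female", "hat"},
    }

    my_char = "George"  # our own card; its traits act as blockers
    pool = {name: traits for name, traits in characters.items() if name != my_char}
    askable = sorted({t for traits in pool.values() for t in traits} - characters[my_char])

    def eliminations_on_no(trait):
        """Characters removed by a 'no' to 'does your character have <trait>?'."""
        return sum(1 for traits in pool.values() if trait in traits)

    for trait in sorted(askable, key=eliminations_on_no, reverse=True):
        print(f"{trait:>11}: eliminates {eliminations_on_no(trait)} of {len(pool)} on a 'no'")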

However -- and this is where the true genius of Milton Bradley's board game creation shines, balancing luck and skill -- there are small yet distinct statistical advantages to be gained in the combination of your first two guesses.

Top 10 Ranked Questions:


Worst 10 Ranked Questions:


It becomes immediately obvious that it's not a good idea to open with questions about beards, brown hair, or the traits our own character has blocked, since they are all under-represented. The same goes for second questions where the first has already eliminated some of those traits, such as gender and hats: a 'no' to female already eliminates two hats (Maria and Claire).

For the sake of being meticulous, let's look at an in-practice example. You draw George, with uncommon traits of white hair and a hat. Theoretically, if all cards were in the deck, asking about hats and white hair would yield a 24/9 result. However, in practice you have already eliminated 2 traits and 1 character from the group as blockers and have a true result of 22/8. This makes asking other 24/9 questions, such as glasses and black hair, superior to the questions involving traits you own. Likewise, this drops starting questions about baldness and hats off the top of the starting list.

Understanding the best starting questions is a little more difficult as it's hard to know at a glance which combination of traits applies to the largest group of characters, especially as the game progresses into the 3rd and 4th turns. Therefore the best thought process seems to be thinking through and eliminating your worst options while memorizing the top five or ten theoretical guesses until you have a better understanding of the top starting guesses versus your character blockers.
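To make the two-question idea concrete, here's a companion sketch that ranks pairs of opening questions on the same invented toy table (repeated so the snippet stands alone); the real spreadsheet does this across the full 24-character, 9-trait board:

    # Companion sketch ranking pairs of opening questions, again assuming a 'no'
    # to both.  The toy table from the previous sketch is repeated here so the
    # snippet runs on its own.
    from itertools import combinations

    characters = {
        "Alex": {"moustache", "black hair"},
        "Alfred": {"moustache", "blue eyes"},
        "Anita": {"female", "blue eyes"},
        "Bernard": {"hat", "big nose"},
        "Bill": {"bald", "beard"},
        "Claire": {"female", "glasses", "hat"},
        "George": {"white hair", "hat"},
        "Maria": {"female", "hat"},
    }

    my_char = "George"  # traits we hold are blockers, so don't ask about them
    pool = {name: traits for name, traits in characters.items() if name != my_char}
    askable = sorted({t for traits in pool.values() for t in traits} - characters[my_char])

    def eliminated_by(pair):
        """Characters removed if the opponent answers 'no' to both traits."""
        return sum(1 for traits in pool.values() if traits & set(pair))

    ranked = sorted(combinations(askable, 2), key=eliminated_by, reverse=True)
    for t1, t2 in ranked[:5]:
        print(f"{t1} then {t2}: eliminates {eliminated_by((t1, t2))} of {len(pool)}")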

I'm hoping this analysis will give me and all of the readers out there with unscrupulous pre-school opponents the edge we need to win. Good luck.
          Updated: Nate Silver and the Election        
Update - Nov 7: Results posted below.

I've been following Nate Silver's numbers on the 2012 US presidential election. Simply fascinating. If he's right, and I'm led to believe his predictions are scary-accurate, this is not a race at all despite how badly the news wants to portray it as one.

They have all the predictions posted on FiveThirtyEight.

Not being American but still wanting to play along, I compiled his state by state predictions into a spreadsheet I'll be using later tonight. I'm actually much more interested in how close Silver's estimates will be to results than I am in the actual election at this point. You can download my spreadsheet below if you want to play at home, too.

Election Spreadsheet

Just enter the results in the Obama and Romney "Actual" columns as they roll in and you'll be able to see the difference between results and Silver's numbers.

Results

With 90%+ of results in now, I am both amazed, yet not surprised, at the results of Nate Silver's predictions. I've learned previously to take the predictions of poker players proficient in statistics seriously, especially when they come out with a 91% confidence rate.

Florida still hasn't been given to Obama, but assuming he retains his slight, predicted lead there, Nate will have picked every single state correctly.

I changed my spreadsheet a bit to give me a Vote Share Accuracy (100% - absolute value of the % difference) column instead of just the % Difference from predicted. Not only did Nate pick every state correctly, his Accuracy with respect to vote share by state averages out to 96.8% with the biggest outlier being West Virginia where Obama received 14% less support than expected. He had 17 states where his predictions were 99%+ accurate.
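For anyone rebuilding the sheet, the accuracy column is a one-liner; here's a rough equivalent with invented state names and vote shares:

    # Rough equivalent of the Vote Share Accuracy column: 100% minus the
    # absolute difference between predicted and actual vote share.  The state
    # names and shares below are invented for illustration.
    states = {
        # state: (predicted share, actual share)
        "State A": (0.52, 0.51),
        "State B": (0.47, 0.49),
        "State C": (0.58, 0.44),  # a West Virginia-sized miss
    }

    accuracies = {
        state: 1.0 - abs(predicted - actual)
        for state, (predicted, actual) in states.items()
    }

    for state, acc in accuracies.items():
        print(f"{state}: {acc:.1%} accurate")
    print(f"Average accuracy: {sum(accuracies.values()) / len(accuracies):.1%}")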

You can download a completed version of this spreadsheet below.


          FTP Relaunch VIP Preview        
Full Tilt Poker will be relaunched on November 6th. They have recently released more information regarding their new VIP program "Edge." I am contemplating a move to FTP, given my current Silver/Gold volume on Stars.

This is my comprehensive breakdown and comparison of the FTP program. Stars has a comprehensive breakdown of their VIP and Milestone tier levels complete with effective rakeback percentages here.

Edge

Similar to Stars, there will be a VIP tier system, called Edge, based on how much you rake. Whereas Stars uses monthly and annual measures of VPP accumulation, FTP will use 7, 30 and 100 day rolling averages of FT-Point accumulation. FT-Points will be earned at a rate of 10 points per dollar raked under the Weighted Contributed model.

There are varying tiers where you will be paid back at different rates/100 points on a weekly basis, with the top tier being $2.50/100 points for 25%. These weekly payments do not affect your FT-Points at all, and you retain all FTPs to be used in the store on Ring or Tourney tickets, and cash bonuses for Diamond tier players.
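The effective rakeback figure falls straight out of the points rate. Here's a quick sketch of that arithmetic; the weekly rake figure is invented, while the $2.50-per-100-points top-tier rate is as described above:

    # Quick sketch of the Edge arithmetic: 10 FT-Points per $1 raked and a
    # weekly payment per 100 points, so effective rakeback = rate / $10.
    POINTS_PER_DOLLAR_RAKED = 10

    def effective_rakeback(payment_per_100_points):
        dollars_raked_per_100_points = 100 / POINTS_PER_DOLLAR_RAKED  # $10
        return payment_per_100_points / dollars_raked_per_100_points

    weekly_rake = 400.0  # hypothetical rake paid this week
    points_earned = weekly_rake * POINTS_PER_DOLLAR_RAKED
    weekly_payment = points_earned / 100 * 2.50  # top tier: $2.50 per 100 points

    print(f"Top tier effective rakeback: {effective_rakeback(2.50):.0%}")
    print(f"${weekly_rake:.0f} raked -> {points_earned:.0f} FT-Points -> ${weekly_payment:.2f} back")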

I've put together a spreadsheet encompassing the Edge program which I've posted below.

  • At the top you will find a duplication of the Edge Status Requirements along with the equivalent daily rake paid averages.
  • Beside that is a breakdown of the Ring Game and Tourney tickets currently available in the store. The ticket values used in the rest of the spreadsheet are an average between the cash and tourney values that players with those annual volumes would typically buy (this is a marginal, less than 0.5% difference no matter how you put it together). I have not included any unknown freerolls in this.
  • The bottom three sections provide breakdowns of the tier based rake requirements for each set of rolling averages, the effective rakeback percentages, and a comparison to the approximate correlating tier on PokerStars with my estimates on average new vs. continuing effective rakeback percentages from the Stars VIP breakdown sheet.

As you can see, the FTP tiers and their rakeback percentages correspond fairly well with the Stars tiers based on the same rake paid for the most part.

There are however a few differences to note:
  • New or returning players can get up to speed much more quickly, being able to reach an intermediate tier within 7 days compared to 1 month on Stars, and to the top tier within 30 days as opposed to up to 1 year on Stars.
  • Over a longer period of time, it requires fewer and fewer FT-Points to maintain your tier status, as opposed to Stars' fixed tier system.
  • Using the rolling averages instead of monthly or especially annual calendar programs allows players to jump in and start receiving their maximum benefits right away, any day of the week or year instead of waiting until the first of the month or January 1st.
  • While the top tier is certainly easier to attain than the top Stars tiers (Elite and Supernova), keep in mind that Diamond is still the equivalent of somewhere between Stars' Gold and Platinum when you take rake and rakeback into account. If you are a Supernova+ volume player, there is no equivalent FTP tier at this time and you will max out at 29.9% fairly easily.
Based on my current volume of play and my inability to regain Supernova status by the end of this year, I will be giving FTP a shot and re-evaluating both programs for January 1st, provided that FTP is not completely reg-infested due to recreationals' fear and misunderstanding of the relaunched site. I'm hopeful that the rolling average requirements and weekly cash payments will help to smooth things out for people who prefer the rakeback system.

          Micro Millions Schedule        


I'm planning on playing a few Micro Millions events on Stars over the next couple of weeks with my time off. The schedule looks fairly intense at 100 events over 10 days so I put together a bit of a spreadsheet to keep track of it.


You can download a copy here: http://dropcanvas.com/s0lq4

All you have to do is use the colour coding to find the formats you like, then put an "X" in the Play column to highlight that event on your list. The Event # column will also update itself to follow the date so you will know which events are being played that day.
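If you'd rather do the same filtering outside the spreadsheet, the logic amounts to this; the column names and rows are made up to mirror the sheet's layout:

# Mimic the sheet: keep only events marked "X" in the Play column, optionally for a given day.
schedule = [
    {"event": 1, "date": "day 1", "format": "NLHE",       "buyin": 1.00, "play": "X"},
    {"event": 2, "date": "day 1", "format": "PLO",        "buyin": 3.30, "play": ""},
    {"event": 3, "date": "day 2", "format": "NLHE 6-max", "buyin": 2.20, "play": "X"},
]  # hypothetical rows

playing_day_1 = [e for e in schedule if e["play"] == "X" and e["date"] == "day 1"]
total_buyin = sum(e["buyin"] for e in schedule if e["play"] == "X")
print(playing_day_1, total_buyin)              # events marked for day 1, and the series buy-in total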

Edit: After a request, I've added a new version (1.1), available at the same download link above, that adds columns for rebuys/add-ons and $ won. Simply enter the number of rebuys and it will automatically add them to your total series buy-in at the top of the sheet.



The sheet is protected so you won't mess up formulas by accident, but there's no password to unlock it if you'd like to make changes or additions.

Enjoy!
          Poker Rust -- Possibly Burnout        
I hadn't updated in a while until my previous post today regarding the WSOP Ladies Event (see below). I've had a nasty cold/flu that's been lingering for the past 6 weeks. I've been completely exhausted, although I am actually starting to feel much better this week.

It kind of put a damper on my poker volume this month. I believe there was a 6-day stretch in there where I didn't even sit at my computer at all, never mind log in to Stars. I probably hadn't gone more than 36 hours between log-ins at any point in the last 5 years until this month. I just didn't have the willpower to put in any sessions.

My break was somewhat mentally refreshing, but now that I'm back at it I'm feeling either a bit of rust or perhaps cash game burnout. Either way, my tilt meter has been off the charts and I haven't been able to get back into the zone I was in for the past two months, so I decided I needed a change of scenery.

As per usual when I feel like this, I somehow immediately think PLO is the interesting choice, only to be quickly reminded that this is not a game for someone looking for stability. So I've been dabbling in SnGs again. I think what I like about them is that they negate the waiting game that you usually experience in cash games, with rising blinds forcing the action. I like MTTs for the same reason, but the time commitment with usually nothing to show for it is a deal breaker for me. I also seem much more capable of compartmentalizing a beat in an SnG and not letting it affect my other games.

I started with the $7 turbo 9-mans and ran like absolute death. I don't know how accurate the HM2 Luck Adjusted Winnings are or to what degree to trust them, but over my first 50 games (lol sample size) I was at +3% EV while at -65% in reality.

I then decided to drop down to the $3.50 turbo 6-mans. I always did reasonably well converting FPPs into T$ in the 235 FPP Sunday Million satellites before Stars made the cash bonuses much more manageable (yay, a result of my March IOM meeting presentation! toot toot!), and these are the closest thing I could find without playing the actual hyper-turbos, as I wanted something with a bit more depth and the ability to multi-table. I instantly flipped the switch and ran like god. My EV line is matching my results, but I started getting dealt premiums left and right, getting them AIPF and holding a lot. I'm currently running at a 26% ROI in those games over about 100 so far. I think I'm going to stick it out there for 1000 games and then move back up.

In that short time span, the one thing I've learned so far is to really get a feel for how the final three is going to play out, on top of the standard push/fold strategy. With the $7s I think I was much too happy to push a tiny edge against wide-range donks and maniacs, only to bust out of a ton of them in 3rd place on a 55/45. I've realized that at this level I can abuse the weak-tights to no end, but I can also sit back and outlast the shove-calling fest of the people donking around, then outplay them heads-up and tailor my three-handed play accordingly.

I've created a spreadsheet with every 6+ man NLHE SnG variant, which I'll add later tonight. It's super handy if you're trying to figure out where to start: filter by game format, speed or buy-in to see what's available in that stream as you move up.

I'm sure I'll inevitably go back to cash games assuming the VPP fever catches up to me again. It just seems like right now games are super nitty and there's not much running between the European segregation and Zoom splitting the pool. So I'm not finding cash all that enjoyable which inexplicably makes me want to force action against nits -- a bad idea leading to tilt and punting more stacks.

Ideally I would like to put in a solid 5k games a step or two up the ladder before deciding which way I want to go.
          New Trip to IoM        
I haven't posted in a while since not a lot has been going on but I wanted to post this and get some input from you guys out there reading my blog.

I have been invited to the second 2012 meeting with Stars on the Isle of Man and have accepted the offer this time. I am not going as a player rep and don't plan on participating in any sort of forum flame war when I return. I'm simply going as an unelected individual player.

However I would really appreciate input from everyone on what your issues and struggles are as well as any ideas for improving yours or others' experience playing on the site. Anything and everything from software to VIP program to game function etc is more than welcome. I do think the individual rake per stake issue has been talked to death and unlikely to change in the near future, but I would still more than welcome any comments or ideas on the overall rake and VIP structures as a whole.

You can use the Google Docs form I created to leave your comments.


          Preliminary Rake Opinion        
Long time no blog.

POKER
Not much going on. I've been taking it pretty easy with poker. I've been way too distracted with the rake change discussions. The IoM trip is over and the reps are back, unfortunately getting flamed for making counter-intuitive statements after having seen Stars' data. I thought this was inevitable given the way the community tends to work, which, apart from the unnecessary delay and the feeling that not much would be added beyond what was originally offered January 1st, was precisely why I turned down an invitation to go. I felt the logical outcomes were going to be one of:
  • Stars is telling the truth and we come back with the same as January 1st. This is basically the reps' version, although they say that Stars is actually taking a bit of a hit now, in addition to making large maintenance concessions for this year as a gift for SN/SNE and a good-faith measure for handling things so poorly in the first place.
  • Stars is not being truthful, would have a short chat with the reps and send them back not being able to say anything under an NDA and end up with the same as January 1st.
  • Stars is not being truthful and would present false data to the reps, sending them back with a good impression and ending up the same as January 1st. This has been the outspoken response from a significant portion of the community, effectively shooting the messenger, which is only made worse by the fact that the reps were put in charge of making the announcement and therefore put a target on themselves. These are Stars' decisions and they need to take the credit, good or bad.

The one huge red flag that goes up for me is the statement that Stars was completely unprepared for the meeting and that they went through the raw data for the first time with the reps. It just sounds too unbelievable to me that a bunch of incompetents stumbled their way into a full scale monopoly on the industry without any player valuation metrics or revenue forecast tools, effectively being taught how to analyze their own data by the player reps and through trial and error for the first time at the meeting and relying on customer service up to this point. I would tend to think a company that large has some of the best number crunchers in the industry on their payroll.

That said, I do think the reps did as good a job as possible given the situation they were put in and I'm grateful to them for that.

Despite what has transpired here, and we're not completely sure yet as we're still waiting on official numbers from Stars, I would still be more than willing to attend a meeting in the future where the discussion isn't quite as urgent as the beginning of year changes require and now that people have a better idea what to expect out of them.

HEALTH

In non-poker related news, I'm trying to get back on the healthy eating bandwagon. I started at 238 lbs last summer and managed to drop 30 lbs by November. Unfortunately the holidays were not kind to me and I gained back 10 lbs. My goal is still to get down to a much healthier 180 lbs, which is about what I was in high school.

I stumbled across MyFitnessPal the other day and it's really opening my eyes to a lot of things, and the app is super user friendly. My Achilles heel is obviously Tim Hortons. Bye bye 500+ calorie breakfast sandwiches known as the best tasting breakfast food ever invented, 150 calorie double-doubles and 300+ calorie fritters. Hello 50 calorie Splenda+milk coffees and the occasional 220 calorie yeast-based doughnut. Of course I could just drink my coffee black like I do at home, but I'm more addicted to that specific flavour that you can only get by mixing theirs with copious amounts of sugar and cream.

You hear a lot of people out there talking about just having to control portion sizes. While that may be true in that you can basically eat anything as long as it's small enough, it doesn't work for me. I just end up feeling hungry and it ends up not being worth it and discouraging. The key for me is going to be finding decent enough tasting food that's low in calories so that I can eat a better sized portion and say "close enough." I've been obsessed with MFP's food database for a few days now and am starting to slowly fill out a spreadsheet with substitutes:

  • Boston Pizza is out. Their cactus cut potatoes are definitely out. If I want pizza, thin crust Delissio or Domino's is the way to go.
  • If I drink soft drinks it's going to be Coke Zero or Diet Dr. Pepper, both of which taste remarkably close to the real thing. But I am going to try to stick with my Crystal Light singles packs more than anything. Trying to drink straight water in sufficient amounts is a lost cause for me. I need flavour.
  • I've been avoiding McDonalds like the plague since last summer. After going through a number of restaurants' menus over the past few days, they actually seem like the better deal cost- and calorie-wise if you stay on their value picks menu, considering the sizes of the meals that you get everywhere else. If I go to Boston Pizza I'm inevitably going to end up paying $20+ and eating 1100 calories between appetizer, salad and pizza, which is 2/3 of my daily intake. At McDonalds I can get a McDouble or a Junior Chicken, or even get one of each with a Coke Zero, and be at 400-500 calories for the meal under $5, for less than 1/3 of my daily intake. Of course some people are going to be partial to what is considered good food, or at least better prepared food like at BP's, but I've always been somewhat of a processed food freak.
  • Eating at home is typically best overall, as long as you stay reasonable there too. You're still better off running out for a cheeseburger if the alternative is frying yourself a ham and cheese sandwich at home. Homemade salads, sandwiches, chicken, fish and even hamburger seem best, and I'm fairly happy about that.
  • I'm not much for snacks, but I do love chocolate, KitKat Chunky bars to be exact. Chocolate chip granola bars seem like a rather poor substitute but they are a 67% reduction. But I did find Quaker Oats Dipps, which are simply granola bars smothered in a layer of chocolate and I can definitely say close enough for a 50% reduction. The Black Diamond cheese strings my kids like are also surprisingly decent and rather filling and I really do like Sun-Rype Fruitsource bars.
As a general rule based on my spreadsheet so far, it looks like I'll be OK as long as I stick to meal items that are 250 calories or less, snacks at 150 or less, and beverages at 100 or less, with as many at 0 as possible. That will let me easily come in at 1800 per day and hit my goal weight in about 4-5 months.
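As a sanity check on that 4-5 month figure, here's the rough arithmetic; the maintenance-calorie number is purely my own assumption rather than anything out of MFP, and the 3500-calories-per-pound rule is only an approximation:

# Rough weight-loss timeline check based on a daily calorie deficit.
maintenance = 2800                      # assumed daily calories to maintain current weight
intake = 1800                           # target daily intake from the rule above
deficit = maintenance - intake          # 1000 kcal/day
lbs_per_week = deficit * 7 / 3500       # ~3500 kcal per pound of fat
lbs_to_lose = 218 - 180                 # roughly current weight minus goal weight
months = (lbs_to_lose / lbs_per_week) / 4.345
print(lbs_per_week, months)             # ~2.0 lb/week, ~4.4 months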

As for exercise, I've been looking at the big picture as a simple EV calculation. 20 minutes of jogging is about 270 calories burned, the same as eating a doughnut. The enjoyment I get out of that stupid doughnut is less than the misery I get out of jogging for 20 minutes, so maybe I should just say no to the doughnut and skip the exercise. But I do plan on getting some amount of exercise in. I was surprised that some of the numbers for Wii Sports are quite high -- Wii tennis comes in at 400/hour compared to real tennis, which I also enjoy immensely, at 900/hour -- and that's something I could do with my family indoors in winter, even though it's not quite as efficient as jogging. But it's still a win/win.

This whole health thing seems to be 80%+ about what you eat and 20%- about exercise when you look at just how much work is required to burn off small amounts of food, so that's going to be my main focus for now. I did the original 30 lbs with zero exercise, so I think this theory is correct, although from what I understand there is some correlation between muscle mass and your Basal Metabolic Rate (how much you would burn if you stayed in bed doing nothing all day).

MFP has a nice little widget, so I've gone and added it to my sidebar to give me some accountability to live up to. Hoping to get down to my goal weight by later this summer.
          Creating an "Upgrade Cycle Project"        

You can also create a custom Monitor to detect the Project software. I would not enable email unless you want ~500 emails telling you that the software doesn't exist on machine xxx.

 

But you can do reporting on the alert to give you a spreadsheet of done or not-done machines. Combine that with Project/Ticket status for work lists, such as machines w/o the software w/ open tickets assigned to them, etc.


          PV Value: Solar Electric Economics Spreadsheet        

"This spreadsheet tool developed by Sandia National Laboratories and Solar Power Electric™ is intended to help determine the value of a new or existing photovoltaic (PV) system installed on residential and commercial properties. It is designed to be used by …


          What Are Some Good Weight Loss Tips? Look At These 5        
What Are Some Good Weight Loss Tips? Firstly, let me introduce myself: hi everybody, my name is Tim and yes, I am a chocoholic. Everybody says that the first thing you need to do is to admit that you have a problem. Well, I have been saying that the problem exists for years, but guess what? Nothing will get better until you decide to change your circumstances. If you want to know how to lose weight, know this: weight loss won't happen until you control what kind of food goes into your mouth.

Everyone knows that if you always ingest unhealthy things, terrible things will happen to your body. Well, giving up terrible food isn't that easy at all. Most people don't really think about what they are putting in their mouths. We all need to really take a look at what the purpose of food really is. Food is simply the fuel for our bodies. Think about food like gasoline for a car. If you put gas that has some water in it into your tank, or perhaps diesel instead of regular unleaded, your engine will be affected. The same is true for your body. It is a system that burns fuel. But what are the good weight loss tips? We're just about there; read just a bit more first.

In the times that we live in we are used to the huge portions that are served up and think nothing of the additives that are added to that food that negatively affect our bodies. Our engines are being damaged!

I for one have finally taken control over what I decide to eat. I now consider why and what I am ingesting. I look at ingredients and I try my darndest to avoid anything that isn't a real food item. Chances are that if you can't say the ingredient, it is not a healthful source of fuel for your body. What Are Some Good Weight Loss Tips? Following are five easy tips to logical weight loss:

1 Ingest at least 3 normal-sized meals per day. I eat good snacks in between meals, but only if I am hungry. Make sure to work normal servings of veggies and fruits into your menu.

2 Keep a journal, or better still an Excel spreadsheet, recording your calories for each food item. I have used this to discover a pattern that I had developed: I would limit my calories for 2-3 days and then I would binge. Without the journal I would not have found the pattern so clearly.

3 Get physical activity for at least 20 minutes per day. Switch up what you do; don't do the same activity every day. Try to work it into your daily routine. If you have to purchase flour, walk or ride your bike to the grocery store instead of driving.

4 I know that many of you will not want to do this, but I plot my weight daily. I have a spreadsheet that plots my weight on a graph. I also graph my average weight loss or gain. This way the averaged chart shows a smoother line trending either up or down, while the daily chart shows sudden ups and downs. The averaged chart shows a truer picture of my weight trend (see the sketch after these tips).


5 This fifth tip for me is the hardest. I avoid refined sugar. It is very easy to become addicted to refined sugar because it can be found in numerous items, even toothpaste. Since I have started staying away from refined sugar my sweet cravings have diminished considerably. Now if I crave something sweet I tell myself that this is my body telling me that it needs some fruit. I still get to indulge in something sweet but it is a healthy indulgence.
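A minimal sketch of the averaging idea from tip 4, using made-up daily weigh-ins and a 7-day moving average:

# The averaged series smooths out day-to-day noise so the underlying trend is easier to see.
daily = [218.0, 217.4, 218.2, 217.0, 216.8, 217.5, 216.2, 215.9, 216.4, 215.1]  # hypothetical lbs

def moving_average(values, window=7):
    averaged = []
    for i in range(len(values)):
        chunk = values[max(0, i - window + 1):i + 1]   # up to the last `window` daily readings
        averaged.append(sum(chunk) / len(chunk))
    return averaged

print(moving_average(daily))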

Well, there you have some common-sense tips for easy weight loss. For a great logical approach to weight loss I would check out this site; it has a great method that will keep you accountable for your actions.

Wishing you healthy eating for healthy weight loss.
          What Are Some Good Weight Loss Tips? Don't Forget These 5        
What Are Some Good Weight Loss Tips? First off, let me introduce myself: hi everyone, my name is Tim and yes, I am a chocoholic. Everybody says that the first thing you need to do is to admit that you have a problem. Well, I have been saying that the issue exists for years, but guess what? Nothing will change till you take control of your situation. If you want to know how to lose weight, know this: weight loss won't happen until you control what kind of food goes into your mouth.

Everyone knows that if you always eat horrible foods, bad things will happen to your body. Well, giving up unhealthy food isn't very simple at all. Most people do not really think about what they are putting into their mouths. We all need to definitely take a look at what the purpose of food really is. Food is simply the fuel for our bodies. Think about food like gasoline for a car. If you put gas that has some contamination in it into your tank, or perhaps diesel instead of regular unleaded, your engine will be affected. The same is true for your body. It is a system that burns fuel. But what are the good weight loss tips? We're just about there; read just a bit more first.

In the times that we live in we are used to the huge portions that are served up and think nothing of the additives that are added to that food that negatively affect our bodies. Our engines are being damaged!

I for one have finally taken control over what I decide to eat. I now consider why and what I am eating. I look at ingredients and I try my best to avoid anything that isn't a real food item. The likelihood is that if you can't say the ingredient, it is not a healthy source of fuel for your body. What Are Some Good Weight Loss Tips? Below are five easy tips to smart weight loss:

1 Eat at least 3 normal portioned meals per day. I eat good snacks in between meals but only if I am hungry. Make sure to work into your menu portions of fruits and veggies.

2 Record your calories for every food item in a journal, or better yet an Excel spreadsheet. I have used this to discover a pattern that I had developed: I learned that I would limit my calories for 2-3 days and then I would binge. Without the journal or spreadsheet I would not have noticed the pattern so clearly.

3 Take exercise at least 20 minutes per day. Switch up what you do, do not do the same activity day in and day out. Try to work it into your daily routine. If you have to purchase milk, walk or ride your bike to the market instead of driving.

4 I know that many of you will not want to do this but I plot my weight every day. I have a spreadsheet that tracks my weight on a graph. I also graph my average weight loss/gain. This way the average chart shows a smoother line trending either up or down while the daily chart shows sudden increases or decreases. The averaged chart shows a truer representation of my weight trend.


5 This fifth tip for me is the most difficult. I avoid refined sugar. It is very easy to become addicted to refined sugar because it can be found in a lot of items, even toothpaste. Since I have started staying away from refined sugar my sweet cravings have diminished considerably. Now if I crave something sweet I tell myself that this is my body telling me that it needs some fruit. I still get to indulge in something sweet but it is a healthy indulgence.

Well, there you have some common-sense tips for easy weight loss. For a great logical approach to weight loss I would check out this site; it has a great method that will keep you accountable for your actions.

Wishing you healthy eating for healthy weight loss.
          Greg Palast in Ohio on GOP Effort to Remove African Americans from Voter Rolls in Battleground State        

Greg Palast in Ohio on GOP Effort to Remove African Americans from Voter Rolls in Battleground State

http://www.democracynow.org/2016/11/8/greg_palast_in_ohio_on_gop

By: Democracy Now
Date: 2016-11-08

In an on-the-ground report from the battleground state of Ohio, investigative reporter Greg Palast has uncovered the latest in vote suppression tactics led by Republicans that could threaten the integrity of the vote in Ohio and North Carolina. On some polling machines, audit protection functions have been shut off, and African Americans and Hispanics are being scrubbed from the voter rolls through a system called Crosscheck. "It’s a brand-new Jim Crow," Palast says. "Today, on Election Day, they’re not going to use white sheets to keep away black voters. Today, they’re using spreadsheets."

Read more »
          More CO2 emissions from Danish power plants        
In connection with my first mashup of the CO2 amounts from the 98 largest Danish power plants, I had entered the data into a spreadsheet. I have now reused that data in Google's Spreadsheet Mapper 2.0. Here a KML file is generated on … Read the rest
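Spreadsheet Mapper does this through a Google Sheets template, but the underlying idea is simple enough to sketch; the column names and plant figures below are made up for illustration:

# Turn spreadsheet-style rows into a minimal KML file, one placemark per power plant.
rows = [
    {"name": "Plant A", "lat": 55.68, "lon": 12.57, "co2_tonnes": 1200000},
    {"name": "Plant B", "lat": 56.15, "lon": 10.21, "co2_tonnes": 800000},
]  # hypothetical data

placemarks = "".join(
    "<Placemark><name>{name}</name>"
    "<description>CO2: {co2_tonnes} tonnes/year</description>"
    "<Point><coordinates>{lon},{lat},0</coordinates></Point></Placemark>".format(**row)
    for row in rows
)
kml = ('<?xml version="1.0" encoding="UTF-8"?>'
       '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>'
       + placemarks + "</Document></kml>")

with open("plants.kml", "w") as f:   # open the result in Google Earth / Google Maps
    f.write(kml)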
          Anticipating the Release of a New Partner Relationship Management Software Called Distance Quick Connect        
Channel Management

U.S.A. (January 22, 2010) – Software and online developer Rising Tackle will be hosting and launching its new, predominantly web-based partner relationship management software tool, called Distance Quick Connect, in March of this year.

The PRM software is modeled after Rising Tackle’s award-winning Customer Relationship Management (CRM) web based tool Focus Streamline, which was a big success last year, creating a forty percent growth for the company since its launch back in December 2008.

According to the company CEO, Distance Quick Connect boasts the power of a simple design, a strategy that makes the web-based tool very flexible and customizable so it can specifically address and suit the individual needs and preferences of consumers. It is predicted to catapult Rising Tackle into a dominant position in the PRM market this coming year, especially with the ongoing recession.

It has been discovered that relationship management tools such as PRM have a lot to gain from the negative economic climate of a recession. A lot of companies, including those that are not dependent on channel sales, are now looking to channel management strategies to survive the economic downturn and generate adequate revenue for the coming years. This is a fact that many software developers have taken advantage of, hence the release of various PRM tools and applications.

The increase in industry demand has prompted companies like Rising Tackle to broaden or expand their services and products to include the increasingly popular partner relationship management. More industries are in need of PRM and PRM software, since modern paradigms of global business are increasing in complexity and interconnectivity.

This trend has helped channel management gain a steady foothold in the market, compared to its situation a few years back. PRM in particular was not a popular choice of business strategy because the methods available for its implementation were manual, outdated and inconvenient. Such methods included spreadsheets, email messaging and phone calls, all of which are now considered inefficient and ineffective, since partners may be located in different parts of the country, or even the world.

Such a disadvantage made communication and collaboration with partners a tasking responsibility, and oftentimes a fruitless prospect.
With partner relationship management software, channels may enjoy real-time monitoring and management, as well as better transparency of channel performance and productivity. Tools and programs will be made more accessible and available to partners, resulting in increased growth and improved sales in both direct and indirect channels.

Doug Sedrick
          Following the union money        
The UAW went down in defeat yesterday in Canton after it spent a great deal of time and money trying to unionize the plant. See where the UAW spent its money in the spreadsheet posted below.









          (USA-CA-Merced) Support Services Assistant Extra-Help        
The Position
Under direct supervision, to develop and prepare a variety of Department applications, information systems, and reports on a microcomputer utilizing spreadsheet, word processing, desktop publishing, data imaging systems, and database software; to be responsible for complex Department and program support functions; to perform a variety of para-professional duties in support of professional staff; and to do related work as required.

Minimum Qualifications
While the following requirements outline the minimum qualifications, only applicants who demonstrate the best qualifications match for the job will be selected to continue in the recruitment process. Applicants must meet the minimum qualifications by the application deadline.
Either, I: Experience: Three (3) years of full time experience performing a variety of office and administrative support work utilizing a microcomputer and software applications, such as word processing, spreadsheet, desktop publishing, imaging, and database software.
OR, II: Education: Equivalent to completion of a two (2) year Associate Degree from an accredited college or university with at least twenty-four (24) units in Office Administration or another closely related field. AND Experience: One (1) year of full time experience which requires proficiency in performing a variety of office and administrative support work utilizing a microcomputer and software applications, such as word processing, spreadsheet, desktop publishing, imaging, and database software, equivalent to a Typist Clerk II or Office Assistant II with Merced County.
To view additional information about the typical duties, knowledge, skills and abilities for this classification, please visit the county website at http://agency.governmentjobs.com/merced/default.cfm?action=agencyspecs

Additional Information
A valid California driver's license and DMV clearance may be required at the time of appointment. Individuals who do not meet this requirement due to a disability will be reviewed on a case-by-case basis. Applicants for positions within the Human Services Agency may be subject to a criminal history background check.
          (USA-CA-Merced) Executive Assistant/Contract Specialist - Hospital Administration - Full Time Days        
*Position Summary* The Executive Assistant/Physician Contract Specialist serves as support to the Vice President/Strategy and serves as the Physician Contract Specialist, overseeing and coordinating the management and renewal processes for all physician/physician recruitment contracts. Performs difficult and/or sensitive tasks for the Business Development Team and other Sr. Leaders, dealing with sensitive information, recruitment matters, preparation of correspondence, minutes, or projects; manages calendars; processes purchase orders; and provides personal administrative support. Handles physician invoices as assigned. Assists with family/patient situations and complaints. Acts as back-up to the Regulatory Administrative Assistant to manage Policies and Procedures according to hospital policy. Duties assigned to this position require the independent development of information necessary to complete job assignments, and the exercise of initiative and good judgment. This individual must be self-directed, with the ability to exercise discretion and independent judgement in the performance of his/her work. This individual must have excellent interpersonal skills in addition to technical skills relating to database management and use of spreadsheets.

*Qualifications Minimum* 1. High School graduate 2. A minimum of 5-7 years of increasingly responsible secretarial or administrative assistant experience, preferably in a healthcare setting. 3. Special technical skills required include the ability to manage (including report writing) a relational database to support contract management functions. 4. Advanced knowledge of Microsoft products including Microsoft Word, Excel, and Access.

*Desired* 1. AA degree or certificate 2. BA or Bachelors Degree 3. Demonstrated experience managing the renewal process and/or implementation of new contracts, including maintaining a filing system and a contract database, and safeguarding the security and confidentiality of such contracts. 4. Substantial knowledge of applicable State and Federal laws and the regulatory compliance process relating to Hospital/Physician contracts, including Stark II laws.

Mercy Medical Center has been building a rich history of care in our community for more than 100 years. We have grown from a small one-story wooden structure into a major healthcare provider with a brand new 186-bed main campus, offering the latest in facility design and technology. Mercy also operates Outpatient Centers, a Cancer Center and several rural clinics. Wherever you work throughout our system, you will find faces of experience with dedication to high quality, personalized care. Joining our 1,300 employees, 230 physicians and many volunteers, you can help carry out our commitment to providing our community with the excellence they have come to associate with Mercy Medical Center.

**Job:** *Administrative Support / Clerical* **Organization:** *Mercy Medical Center Merced* **Title:** *Executive Assistant/Contract Specialist - Hospital Administration - Full Time Days* **Location:** *California-Central California Service Area-Merced-Mercy Merced Community* **Requisition ID:** *1700009134* **Equal Opportunity** Dignity Health is an Equal Opportunity / Affirmative Action employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, protected Veteran status or any other characteristic protected by law.
          (USA-CA-Merced) Fire Prevention Inspector I/II/III        
Desirables: Advanced technical knowledge within a specific inspection area based on the California adopted codes and regulations; California Administrative Codes Titles 19 and 24, NFPA Standards, Campus Standards, fire protection systems, hazardous materials and other related codes and regulations. Comprehensive and advanced knowledge and understanding of California adopted codes and regulations, with the ability to interpret those codes and regulations. Advanced knowledge and skills in completing comprehensive plan review and inspection services for complex construction projects and existing buildings, including a wide variety of fire protection and fire and life safety systems. Strong written, verbal and interpersonal communication skills, including advanced political acumen and skill to deal with diverse constituencies in a highly political environment. Advanced analytical skills and the ability to organize, prioritize and manage the successful completion of high-level and potentially complex projects within budget and time constraints. Advanced skill to effectively represent the organization to state and federal authorities, and community groups. Experience performing a variety of work with computer database, word processing, and spreadsheet software programs. Experience performing weed abatement related work.

Duties may include, but are not limited to, the following: Inspects a variety of structures and facilities ensuring compliance with all State and local fire codes. Assists in weed abatements and data entry during weed abatement season. Provides and hands out information to building owners and others regarding fire codes and standards, and ensures that questions and concerns are effectively resolved. Assists in hydrostatic water testing of fire protection systems as outlined in the California Code of Regulations Title 19 and NFPA 291 standards. Meets with the general public to conduct fire safety presentations and provide information regarding fire prevention and life safety. Maintains the database and invoicing for operational permit inspections. Issues various permits as outlined in California Fire Code Section 105. Performs related duties as assigned. Assists with and observes public fireworks display inspections and permit review.
          Organization, Preparation Key in Coaching        
Team huddle between innings in our game against Sokol Hluboka last month.
I've always felt that being organized and prepared gives you an advantage in life, whether it's in the classroom, workplace or on the playing field.

I'll freely admit that as a baseball coach, I've worked with and gone up against other coaches who know the game better than I do, know how to coach up and motivate players better than I do, and how to manage a game better than I do. So, while I feel I've closed that gap a bit over my eight years in coaching, I still feel the need to use other strengths of mine to help level the playing field.

I was raised in a military family, and while I did not completely grasp the discipline that enabled both of my grandfathers to serve in World War II and my father to have an admirable 23-year career in the U.S. Army, I do feel I've taken a couple of traits from the military lifestyle, being organized and prepared, and applied them to my coaching.

Those traits, I've found, are necessary more than ever while coaching in Europe. Back in the states, I was a part of high school staffs that usually had at least five and sometimes as many as eight coaches on the field every day. We'd have coaches focusing on the pitchers, the catchers, the outfielders, the middle infielders, the corner infielders, the base runners, the hitters, guys whose main job was just to keep the clock and make sure we remained on schedule with the practice plan, you name it. Even while coaching travel ball during the summer and fall, in what is a much more relaxed atmosphere, I'd have at least two assistants with me at practices and games.

Having that many coaches on the field really helps in our ability to develop a player's individual skills, to maximize practice time, make sure that nothing gets overlooked whether it's a player's footwork when turning a double play or a hitter slowing his hands down on breaking balls away. It's the no-stone-unturned methodology that I learned while coaching under Pudge Gjormand at James Madison HS in Vienna, VA, and helped with implementing when I moved over to assist Morgan Spencer at Centreville HS in Clifton, VA, in 2008.

That frame of mind had to be adjusted when I came over here, as I initially discovered when I hopped on late last summer to help Anthony Bennett out with his Szentendre Sleepwalkers team down in Hungary. During more than half of our practices and games, I'm the lone coach on the field. And that was certainly an adjustment from what I was used to back home. I don't always have an assistant to lean on to throw batting practice if my arm is dangling, to ask for their opinion on potential lineup adjustments, to pass off some administrative duties to, or to hit fungos to the infielders so I can go watch bullpens.

And that's not necessarily a bad thing.

It's forced me - and this may scare some people back home who already think I'm an OCD-crazed maniac at times - to become even more prepared and organized to make sure everything is in order when I arrive at the field and I can just focus on getting our team through that day's practice or prepared for that day's game.

In addition to the usual on-field and off-field coaching responsibilities, I make a point to email weekly schedules to players each Monday, and a spreadsheet is put together of who can and cannot attend each practice or game; practice plans that are detailed down to the minute are put together and emailed to our club's offices to be printed up and posted in the dugout every day; lineups for both games of our doubleheaders are put in our iPad scoring system, on our lineup board that is posted in the dugout and on the lineup cards the night before our games; heck, I even map out best- and worst-case pitching changes and substitutions for each game as I know I sometimes won't have the time to think them out on the fly while trying to manage everything else that's going on during a game.

In practices, I've tried to stress the importance of our guys focusing during drills so we're able to get through them more quickly, to sprint on and off the field as that saves time and keeps the Baseball Gods happy, and might designate outfielders to hit ground balls to infielders, or pitchers to hit fly balls to outfielders once they're done with their bullpens, long toss or running. I've had to reach to find ways to effectively get our team through a quality practice, and it's honestly been fun to try and think up new ways to do so. 

As a coach, you're always telling your players to think before the pitch is thrown so you know where to go with the ball if it's hit to you. You tell them to mentally begin their at-bat while on the on-deck circle, so you can just react and not think once you step into the batter's box. You're expecting your players to be prepared, so my feeling is that you had better follow suit and be prepared yourself.

Not having a pitching coach has forced me to study and learn more about that aspect of the game, which was undoubtedly my weakest trait. I appreciate when my one designated assistant, Ales, is able to get away from work and come to practice because I know our outfielders will be getting some good work in that day. And when we do have someone come and throw batting practice, I make sure to take advantage of being able to watch our hitters more closely and help them make adjustments with their swings, or simply be able to give them a pat on the back and tell them how well they're swinging the bat if that is the case.

Don't get me wrong, our club does absolutely everything it can to support me and give me what I need. I've had former players come out to throw batting practice and swing a fungo, I have Ales and our general manager, Jiri, come out as often as they can to lend a hand, and I am always being asked if there's anything I need. But it's simply too difficult to find quality baseball people that can get away from work on a consistent basis to come and help out, and it forces you to make adjustments to how you go about your coaching responsibilities, to become even more prepared and organized, or else you will let your players down because they're not getting developed like they should be.

I feel that if a coach ever becomes stagnant and content with what he is, then it's time for him to get out of the game. I couldn't dig up the quote from Tony LaRussa, but I remember reading somewhere that he said that in four decades of coaching the game, he still learned something new about it every day. That's how I'm treating my experience over here. Use it as a learning experience, and hopefully I'll get better at the trade in the process.

At the very least, I'll know what substitutions I'd like to be making in the eighth inning of the second game of our doubleheader that weekend.
          Google Offers Cheap Online File Storage        
If you’re a power user of Google’s Gmail or Picasa Web Albums, Google wants to help you with your storage limit woes. As of today, you can purchase extra online storage that can be used across email, images and soon, docs and spreadsheets.
Here’s what you can get (cost per GB is worked out after the list):
  • 6 GB ($20.00 per year)
  • 25 GB ($75.00 per year)
  • 100 GB ($250.00 per year)
  • 250 GB ($500.00 per year)
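Worked out per gigabyte (just dividing the listed yearly prices by the plan sizes), the larger plans are the better deal:

# Cost per GB per year for each listed plan.
plans = {6: 20.00, 25: 75.00, 100: 250.00, 250: 500.00}   # GB -> USD per year, from the list above
for gb, price in plans.items():
    print(gb, "GB:", round(price / gb, 2), "USD per GB per year")   # 3.33, 3.0, 2.5, 2.0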

          How to get a Justin Bieber on board with your nonprofit        


Sara Choe

As you foray into online fundraising for your nonprofit, you might have stumbled upon the term “social media influencer”. You might wonder what that is exactly and where would you even get one of those.

A social media influencer is simply a person whose online presence carries weight. There are many kinds of influencers, but they share the following qualities:

1. Credibility

An influencer has had to earn their audience’s trust in some way, and more trust means longer attention spans.

Being an expert in a field is one way an influencer gains credibility. Physicians like Sanjay Gupta or Dr. Oz are influential – the former as a CNN medical correspondent and the latter as host of the eponymous television show – primarily due to their medical expertise.

2. Reach

Social media influencers have extensive audiences. They have numerous followers across several platforms, and sometimes those audiences overlap; that is to say, one might follow a blogger on both Facebook and Twitter. For example, author, blogger and speaker Jon Acuff has more than 49,000 fans on Facebook and nearly 186,000 followers on Twitter.

3. Quality engagement

The influencer’s audience is marked by quantity and quality. That is to say, she has many meaningful interactions with most of her many followers. Her followers not only consume her content but share it with their respective audiences.

Searching for Social Media Influencers

Now that you know who social media influencers are, you need to learn how to connect with those who would best advocate for your organization.

Hint: An influencer’s celebrity doesn’t translate into being an effective endorser of your cause. While not impossible, an urban fashionista probably would not be the best spokesperson for a nonprofit focused on promoting sustainable agriculture in the global south. A food critic who writes extensively about organic farming and shopping at farmers’ markets, on the other hand, would be a better fit.

Finding the right social media influencers involves more than just referring to industry and social rankings; you’ll have to do your own research to identify these key people. Fortunately, searching for them is not like trying to find Waldo.

Basically, you’ll be relying heavily on search engines and social media to find and befriend these influencers. Here are our tips on engaging with social media influencers.

1. Search online

Search for content – news articles, blog posts, research papers, trade magazines and books – germane to your field of work. Note the authors and their respective sources.

Chances are you’ve already come across these names if you’re already keeping up your industry’s news. Create a spreadsheet where you can record and keep track of these names.

2. “Stalk” them

Plug these names into Twitter and start following them. Add their handles to your spreadsheet and note other interesting facts about them.

Do likewise on other channels, such as Facebook, Google+, LinkedIn, Pinterest and YouTube. Skim through their information and content, and record your findings in your spreadsheet.

Sometimes, social media channels are search engines unto themselves. Run a Twitter search for industry keywords to find people that your Google (or Bing or Yahoo!) search might have overlooked.

3. Measure their social influence

Use Klout or a similar platform to get detailed, statistical analyses of the influencers’ influence. You can use these tools to find more influencers, too.

4. Determine your targets

Now that you’ve gathered information on the persons of interest, narrow the field of candidates. You’re free to lend as much or as little weight to the myriad opinions on the who’s who in the social media sphere.

But at the end of the day, only you can decide who is right for your cause and your campaign. One factor you might consider could be how much time you can devote to engaging with these influencers, in addition to their degree of relevance.

5. Organize them

Once you’ve finalized your selection of social media influencers, group them by categories of your choice. You can create lists on Twitter and circles on Google+ to help manage your interactions.

You can publicize or keep private the lists/circles you make. If you opt to share your lists publicly, be careful how you display them; avoid giving the impression that you’re familiar with these people if you actually haven’t met them before.

6. Interact with them

Across the various platforms, start following your chosen influencers. Listen to what they’re saying. Respond to their thoughts. Start conversations. Ask questions.

This is a marathon, not a sprint

Remember that influencers are ultimately people to connect with, not instruments to be played. So offer ways to serve them; in that way, both your mission and your influencers benefit.

For more info, read our free online guide on social media influencers.

Sara Choe is an online fundraising expert at CauseVox, a crowdfunding platform for nonprofits and social good projects.

          Use Google Spreadsheet As Website's Time-Machine (Web Archive alternative)        
none
          3 Things You Didn't Know Google Spreadsheet Can Do For You (Part 1)        
none
          Get Notified When A Collaborator Made Some Changes On A Shared Google Spreadsheet        
none
          Digital Savvy by CompuScholar {Homeschool Review}        
CompuScholar, Inc.

We live in a day and age where everything is becoming computer based, and it looks like it will continue to be that way.  I have been slowly having my children learn basic computer skills, and they even do some of their subjects completely on the computer.  No matter the job, computers will always be around.  So, in light of that, we've had fun reviewing the new Digital Savvy course from CompuScholar, Inc. (formerly known as Homeschool Programming).

CompuScholar, Inc. Digital Savvy


Digital Savvy is for 6th-12th grades. You have the option to pay monthly or for the entire year ($120). The course has 25 chapters (which include several lessons each) and can be done over the course of a year.  I reckon you could even do it in less time depending on your child's productivity and interest.  The course site is well laid out and very user friendly.


This course covers fundamental computing topics and skills such as:
  • Hardware, software, and operating systems
  • Managing files and folders
  • Basic networking
  • Online safety and computer security
  • Using Word processors, spreadsheets, and presentation programs
  • Creating simple databases
  • Image editing
  • Using social media and email communications
  • Introductory website design
  • Simple computer programming concepts
  • Exploration of computing careers

The course requires you to have Windows or Mac operating systems.  To complete the student work, the systems must be Windows 7, Windows 8, Windows 10, or Mac OS 10.7 version or higher.
Your child will complete fun, hands-on projects and a quiz in every lesson.  Each lesson has a video to watch and a student text to read.  It's probably a good idea to both watch the video and read through the student text, as the text sometimes has info that's not in the video.  

At the end of each chapter is an exam.  The program also records these exams.  The lessons can be done at your own pace.  My daughter, who is in 9th grade, did roughly 1 chapter each week; it just depended on the rest of her workload for the week.  She enjoyed the course, although it took her a little while to get into it and want to do it, but she's a teenage girl and texting her friends sometimes distracts her from schoolwork.  We're working on that.  Haha!

Overall, I feel this is a great course and much needed in our day and age.  Looking over the course there are a few lessons that I want to go in and read through.  I'm sure there is plenty in there for me to learn!  I learned all my computer skills on my own and wish I had learned through a course like this.  This is a nice easy course for your child to learn the basics of computer knowledge without making mistakes some of us oldies (or maybe it's just me?!) tend to make.






CompuScholar's Social Media:
Facebook
Twitter


CompuScholar also offers Web Design and Java Programming.
To read reviews for those products or more reviews for Digital Savvy, click below.
Digital Savvy, Web Design & Java Programming {CompuScholar,Inc Reviews}


Crew Disclaimer



          Google CFO retiring, leaves spreadsheets for backpack        
Google CFO retiring, leaves spreadsheets for backpack
          Comment on Use a Custom Script to Automatically Email the Submission Contents of a Google Form by Edo Plantinga        
Yes:
var formResponsesURL = s.getParent().getUrl(); // the spreadsheet with the form responses
Also see the script I just posted in another comment.
          WHY CLEAN THE WATER STORAGE TANK??        
To respond to the survey, click Here!!
          Translate your favorite with Google        
Previously I described Priority Inbox, a feature that works in Gmail, and at the moment Ambae.exe has the chance to use it together with others at AMBAE. It was mentioned earlier that Google has released 2 (two) of its newest features. What will be described below is Google Translate, with even more powerful features. For Googler friends, Google Translate is nothing new, but there may still be readers who want to get to know it from the beginning, especially Ambae.exe, who is still a novice.

This service was introduced in 2007 and began including Indonesian among the languages that can be translated on 25 September 2008. The main aim of Google Translate is to translate a piece of text or a web page from one language into another.


So why is it claimed to have more advanced facilities than other translators...? Unlike other translation services such as Babel Fish and AOL, which use SYSTRAN, Google uses its own translation software. That is the greatness of good old Google: the owner does not want to borrow anything from other parties. If necessary, Sergey M. Brin and Lawrence E. Page will simply buy another party's product in order to dominate the online world.

Moving on, friends, let's go through Google's Translate one piece at a time. Google Translate support covers:
- Google Translate versi standar
- Google Search
- Google Toolbar
- Google Chrome
- Google Labs
- Google Mail (Gmail)
- Google Chat (chat via Gmail)
- Google Talk
- Google Groups
- Google Moderator
- Google Docs (Google Documents)
- Google Video (YouTube)
- Google Mobile
- Google Android
- Google Web Element
- Google AJAX API
- Google Toolkit

A. Google Translate, standard version

Steps A1
Visit Google Translate, then type or paste the characters, words or sentences you want translated. Choose the translation language to be used and click Translate. Swap the translation direction by clicking the Swap languages button.


Steps A2
Click Listen to hear Google's spoken version of the translation. Click Read phonetically to tell apart the sounds/pronunciation of what Google plays back. The Read phonetically method uses the phonetic notation system (International Phonetic Alphabet), a set of symbols with which all human languages can be written and described. Think of distinguishing the sound fa from pa and va, or sa from xa and za, or ca from ka and qa, as well as ei from ee and ey, and other alien-sounding languages.


B. Google Search

Steps B1
Go to the Google homepage and type the keyword you want to search for. The search results appear as the 10 URLs/websites on Google's first page most related to that keyword. To translate one of them, click the Translate this page (Terjemahkan laman ini) link on the right-hand side of a search result URL. Translation can be fine-tuned by adjusting the search method in the Advanced Search or Preferences menu.


Steps B2
After you click the Translate this page link, Google translates the page right away. Choose the translation language on the next page. There is no Swap languages button here to reverse the translation direction, so you have to change the languages manually in both drop-down menus.


Steps B3
Find something unique about your name: in the search box, type the command Translate 'My Name is xxxxxxx' to yyyyyyy.
~ xxxxxxx : Name
~ yyyyyyy : Language


C. Google Toolbar

Steps C1
Google Toolbar works in the Microsoft Internet Explorer and Mozilla Firefox (version 2.0 or higher) browsers. The recommended operating system (OS) is Windows XP SP2, Windows Vista or higher. To enable it, download and install Google Toolbar and follow the steps through to Finish. If Google tells you to restart, Googler friends, just go along with it; don't argue with your elders (good old Google) while it's FREE. Once installed, Google Toolbar will sit at the top of the browser window, on the toolbar menu to be precise.



D. Google Chrome

Steps D1
Similar but not the same: Google Chrome resembles Google Toolbar. The difference is that Google Toolbar is an add-on that works in the IE and Mozilla Firefox browsers, whereas Google Chrome is a browser with the Google Translate feature built in, presented in a Google Toolbar-like way. It only works on the Windows XP SP2, Windows Vista and Windows 7 operating systems. Rather than getting confused, download the newest version of Google Chrome and enjoy the warmth of Google Translate along with it. For the Beta version, download it here.


E.
Google Labs and Google Mail (Gmail)

Steps E1
Incoming email is not always in Indonesian. Even Indonesians sometimes use an alien language when sending email or when chatting. Install one of the Google Labs features from the Settings menu: choose the Labs sub-menu, enable the Message translation feature, then save the settings.


Steps E2
Still in the Settings menu, choose the General sub-menu and set the default translation language, then save the latest settings again.


To disable this feature, tick Disabled in the Labs menu. However, if you want to turn off Google Labs entirely without picking through its features, visit OFF.

F.
Google Chat (chat via Gmail)
and Google Talk

Steps F1
Fitur ini memanfaatkan Google Bots dalam menerjemahkan setiap pesan Chat. Googler yang terbiasa dengan Google Wave, kiranya tidak mengalami kesulitan dalam mengaplikasikan Google Bots. Guna memudahkan sobat Googler, Ambae.exe menyuguhkan daftar Google Bots yang dapat ditambahkn pada Gmail Chat atau Google Talk.




The Google Bots in this feature are Google robots that look up the data a Googler is after and then translate it according to each bot's language pair. Don't forget to append @bot.talk.google.com; the bot address format is xxxxxxx@bot.talk.google.com.
~ xxxxxxx : one of the bots
~ Example: zh2en@bot.talk.google.com is the Google Bot for Chinese to English.

G. Google Groups, Google Moderator, and Google Docs (Google Documents)

Steps G1
Discuss things with fellow Googlers, edit documents, respond, ask, answer, suggest, or exchange ideas and feedback to and/or from Google. With Google Translate built in, it feels like attending a bilingual lecture. You don't need to fiddle with Settings or Preferences; just visit the Translate pages for Google Groups, Google Moderator, and/or Google Docs (Google Documents).

H. Google Video (YouTube)

Steps H1
The YouTube search page is similar to the Google homepage. Click Translate to translate a video's description, or click Show original to switch it back to the original.


Steps H2
Do you often watch HBO or HBO Asia, Cinemax, Star Movies, Star World, and the like? Watching a favorite film on those famous channels is genuinely enjoyable. They are available bilingually, and every film also carries running text (subtitles), even after the film has been dubbed from English into Indonesian. No wonder HBO claims, "It's not TV, it's HBO."

What about Google Video or YouTube? Videos that come with text captions can likewise be translated into various languages, provided the video actually has captions (running text) transcribing what is spoken in it. Click the (^) icon and then the (cc) icon, both in the bottom-right corner of the video, then click Translate Captions to set the target language.


Steps H3
Choose the target language from the drop-down menu, then click the Translate button. In an instant, the video captions are translated into the selected language.


I. Google Mobile dan Google Android

Steps I1
Translate via Google Mobile from your mobile device by visiting Google Mobile.

Steps I2
On Google Android, the Google Translate feature really comes into its own. Beyond text-to-text, a Googler can translate text into speech, and conversely have speech translated back into text. Download it for Android to experience it.

J. Google Web Element, Google AJAX API, and Google Toolkit

Steps J1
Want a translation tool on your own webpage? Embed the Google Translate code in it.

Copy and paste the following script code:
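A minimal sketch of the Google Web Elements translate snippet as it was commonly embedded at the time; the pageLanguage value "id" (Indonesian) is an assumption here, so adjust it to your own page's original language.

<!-- Google Translate Web Element: renders a language drop-down on the page -->
<div id="google_translate_element"></div>
<script type="text/javascript">
  function googleTranslateElementInit() {
    // "pageLanguage" is assumed to be Indonesian ("id"); change it to match your page.
    new google.translate.TranslateElement(
      { pageLanguage: "id" },
      "google_translate_element"
    );
  }
</script>
<script type="text/javascript"
  src="http://translate.google.com/translate_a/element.js?cb=googleTranslateElementInit">
</script>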

The script above is only an example; for manual configuration, visit Google Web Element. As for placement of that HTML, put it inside the BODY section.

If you don't want Google to translate your webpage, copy and paste the following meta tag and place it in the HEAD section:
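A minimal example of that meta tag, to the best of my knowledge of Google's documentation of the time:

<meta name="google" content="notranslate" />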


Steps J2
Webpage and Web Element translation becomes even more flexible with the Google AJAX API. Of course, that means a Googler will be playing with Javascript more often.
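A minimal sketch using the old Google AJAX Language API (later deprecated by Google); the "output" element ID and the English-to-Indonesian ("en" to "id") language pair are placeholders of my own, not part of the original post.

<script type="text/javascript" src="http://www.google.com/jsapi"></script>
<script type="text/javascript">
  // Load version 1 of the AJAX Language API.
  google.load("language", "1");

  google.setOnLoadCallback(function () {
    // Translate a short string from English ("en") to Indonesian ("id").
    google.language.translate("Hello, world", "en", "id", function (result) {
      if (!result.error) {
        // "output" is a placeholder element ID on your own page.
        document.getElementById("output").innerHTML = result.translation;
      }
    });
  });
</script>
<div id="output"></div>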

Steps J3
This one is even better. A Googler can upload and download translated data with the Google Translator Toolkit. For full details, visit the Google Translator Toolkit page, where everything is clearly explained through Google's own video tutorials.

Fellow Bloggers and Googlers who want to know more about these features should consult Google directly, given how long this article already is and how involved the subject gets. As a newbie, I welcome corrections to any mistakes it may contain.

The next offering is the list of languages supported by Google Translate so far, 57 languages in all:
* Afrikaans
* Albanian
* Armenian (Alpha)
* Azerbaijani (Alpha)
* Arabic
* Basque (Alpha)
* Belarusian
* Bulgarian
* Catalan
* Chinese (Simplified)
* Chinese (Traditional)
* Croatian
* Czech
* Danish
* Dutch
* English
* Estonian
* Filipino
* Finnish
* French
* Galician
* Georgian (Alpha)
* German
* Greek
* Haitian Creole (Alpha)
* Hebrew
* Hindi
* Hungarian
* Icelandic
* Indonesian
* Italian
* Irish
* Japanese
* Korean
* Latvian
* Lithuanian
* Macedonian
* Malay
* Maltese
* Norwegian
* Persian
* Polish
* Portuguese
* Romanian
* Russian
* Serbian
* Slovak
* Slovenian
* Spanish
* Swahili
* Swedish
* Thai
* Turkish
* Ukrainian
* Urdu (Alpha)
* Vietnamese
* Welsh
* Yiddish

          Free calls with GOOGLE        
Making phone calls has become ever more popular over the past several years with the arrival of the cordless mobile phone. Mobile phones have reached even remote villages; even mountain climbers (nature lovers, as they are commonly called) can enjoy the sophistication of cellular technology. Broadly speaking, telephones come in two varieties: fixed-line phones and mobile phones. Fixed-line variants include home phones and home-style handsets (CDMA Flexi). Mobile phone variants include GSM handsets, CDMA handsets, satellite phones, smartphones, the iPhone, and the BlackBerry.

Beyond these two main varieties there is now also the Internet phone. Long before Google launched Google Voice and Google Phone, Yahoo! was already offering Internet-based calling through Yahoo! Messenger, which it claims is state of the art to this day. This post describes, briefly, how to make calls the GOOGLE way.


One of Google's newer services is Google Phone. Googlers can use Google Phone by having a Google Account, which is then used with the Gmail service, inside which Google Voice and Google Phone are included. All right, fellow Bloggers, let's get straight to it, since Google Phone has already been covered at greater length by other bloggers.

Steps 1
Armed with a Google Account, log in via Google by visiting Gmail.


Steps 2
The Gmail Inbox page appears once the login succeeds. The Google Phone service is not yet available to every Googler; after logging in, some Googlers must wait for Google to roll the feature out to their Gmail account. This is Google's usual way of introducing services to its users. Much like when Google Wave first launched, only INVITED users could enjoy it, while everyone else had to wait for an INVITATION from GOOGLE or from fellow WAVERs. While waiting for your invitation, don't forget to activate Google Voice.


Steps 3
For the lucky Googlers, a pop-up titled New! Make phone calls from Gmail appears in the Gmail Inbox. Click Try it now to get started.


Steps 4
Next, click Accept to agree to the terms GOOGLE applies.


Steps 5
The Google Phone pop-up opens in the corner of the Gmail page, and the Googler is ready to make calls. At present, Google offers free outgoing and incoming calls to the United States and Canada, each for up to 2.1 minutes, worth $0.10. For a PHONER (yet another coinage, this time for a Google Phone user) who has already tried it and wants to re-enable the service, click the Call phone menu on the right side of the Gmail page and the Google Phone pop-up appears immediately. To place a call, there is a Dialpad sub-menu consisting of number/letter/symbol keys, a Clear/Delete/Cancel key, a field for the number or a search query, a destination-country selector, and a Call button.


Steps 6
Numbers can be entered by clicking the digit keys; alternatively, you can paste a number copied from somewhere else.


Steps 7
Besides the Dialpad, the Google Phone menu also contains Calling history and Calling credit sub-menus. Calling history holds the list/log of incoming and outgoing calls.


Call credit, in turn, consists of:

a. History

The same as the Calling history sub-menu, except that this time a new tab opens with the complete call list. Manage contacts on this page by adding them to or removing them from the contact list.


b. Rates

On this page the Phoner is presented with a list of the charges applied to each call for every destination country and mobile carrier. Overall, Google Phone rates are still quoted in US dollars, including for Phoners in Indonesia.


Here is the Google Phone rate list (Dollar mode ON):





c. Add credit

The Google Phone account can be managed on this page, in particular its language and time-zone settings. Credit can be topped up in $10.00 increments by clicking Add $10 credit.


Steps 8
Other rather handy buttons include:
- The Redial button, to call a number again
- The End button, to end a call
- The Answer button, to answer an incoming call
- The Screen button, to hold/pause an incoming call
- The Ignore button, to ignore an incoming call
- The Speaker button/icon, to control the microphone audio
- The Dialpad button

In call mode, the Dialpad button MINIMIZES and MAXIMIZES the Google Phone display. It is different from the Minimize, Pop-out, and Close tools in the top-right corner of the Google Phone pop-up.


Steps 9
The keypad Google Phone uses shows letters and symbols as well as digits. Letters cannot yet be entered by clicking, however, except via PASTE into the search/dial field.


Letter input is not yet supported. I ran experiment after experiment across the whole service, including with the letter keys, imagining they might be usable for SMS. The result was an error and the immediate message Please enter a valid number. SMS is apparently not possible yet; we will just have to wait for the next innovations as Google Phone develops.

Steps 10
One addition, especially for Googlers who have not yet activated the Google Voice service: before using Google Phone, it is important to activate Google Voice first, and Google Voice itself only becomes active once the Google Voice and Video plugin has been installed.


Once all of the steps above are complete, a Phoner can enjoy chatting away happily via Google Phone.

          Betsy Bird's Top Ten List of Home Improvement Principles YOU Should Know        
Oh faithful readers ... how I have betrayed you! It's been seven days since my last post. It's not that I haven't been thinking of you, or itching to post. It's that between a four-day visit from my parents and an overnight trip out of town and three different kid performances to attend and a job and the small matter of a major construction project going on in my backyard, I haven't had time to look for my laptop, much less use it. Please don't feel bad. My laundry has been equally neglected.

But I'm back, and ready to aim for a post-per-day in May. Toward that end, tonight I offer some home improvement advice.

A bit of background is in order. It's a long and tragic story, with many chapters and subplots and a red herring or two. But here is a synopsis.

Long ago and far away, a builder constructed a house on the last undeveloped lot on a Southern suburban street. Neighborhood children had been happily playing on that lot for years because it was so much fun to roll down the ravine at the back of it toward the creek.

The builder sold the house to a family.

Ten years ago, that family sold the house to us. As soon as we left the closing, the family began to laugh.

Cut to five years ago. We began to notice the back corner of our kitchen was sloping.

We ignored it. By a couple of years later, we couldn't anymore. Things around our house -- or shall I say, under our house -- were going downhill. Literally.

One pie-in-the-sky architect, five different contractors, one greedy underhanded overpaid engineer, an ocean full of gravel and cement, and a couple of years later, our house is stabilized. Now that that's done, we're finally replacing our deck and screen porch, so that we can look out upon the ravine into which we have poured money equal to the cost of a complete college education for one of our children.

I've learned many things on this journey. (Ever notice how everything today is a journey? American Idol, Top Chef, Project Runway, Dancing With the Stars, running for president -- apparently we've all been traveling and we didn't even know it.) Tonight I'd like to share some of them with you.

Betsy Bird's Top Ten Things Every Home Improver Should Know

10) All home improvement projects involve the modification or removal of a previous home improvement project that didn't improve anything. This is particularly true if the previous home improvement project was undertaken by the husband of the woman living in said home. Tip: Should your spouse announce plans to replace your gutters himself, immediately seek a restraining order.

9) Home improvement is actually a euphemism. The real name of what you are doing is Making One Decision After Another. I can make pie crust, curtains, and spreadsheets, but I have a lot more trouble with decisions. If you do not like to make decisions, you should learn to love the house you already have.

8) Here's a handy formula for determining the cost of the additional furniture you'll need for your improved space. For each additional 10 square feet of living space, count on spending more than you can possibly afford for the whole room.

7) Human life expectancy may increase with each passing decade, but today's appliances live on borrowed time. The dishwasher you buy to replace your 20-year-old one will develop a terminal illness before it reaches kindergarten age.

6) HGTV is not reality television. It is a fictional dramatic series written and acted by the same people who brought you imaginary friends.

5) Porta-potties can blow over in heavy winds.

4) When it comes to electric sockets, three things matter: location, location, location.

3) You know that pest control guy you've been paying to come to your house each month and spray for termites? That's all he's been doing -- spraying. You mean you wanted him to actually prevent termites? Geez ... you should have said so. That costs extra.

2) Laura Ingalls Wilder lived in a little house on the prairie. Surely we can all live in houses without granite countertops.

And my Number One suggestion about improving your home ...

Don't.

          AAVSO Bulletin 77 for 2014        

Predicted Dates of Maxima and Minima of 381 Long Period Variables for 2014

AAVSO Bulletin 77 - Predicted Dates of Maxima and Minima of Long Period Variables for 2014 - uses the same configurable format that was introduced with Bulletin 74 in 2011. Most of the information that was included in earlier numbers of the Bulletin is included in this new format, along with more information and links to external resources. Please read all of the materials provided through this page to familiarize yourself with the new, user-definable format of the Bulletin and the ways you can use it in your variable star observing and research. If you have questions, please contact AAVSO Headquarters.

 

 

Fixed Versions:

Bulletin 77 as a single PDF file (for printing)
Bulletin 77 as a single CSV file (for spreadsheets)
Bulletin 77 dates for the double-maxima stars V BOO, R CEN, and R NOR

Bulletin 77 in order of need: stars with fewest number of observations (N(Obs)) through greatest number (does not include V Boo, R Cen, R Nor)

 

Live Version: The Bulletin Generator

Click here to generate a customized AAVSO Bulletin to suit your own needs.

Note: The double-maxima stars V BOO, R CEN, and R NOR are NOT included in this Bulletin Generator; click here for Bulletin 77 dates for these stars.


          AAVSO Bulletin 75 for 2012        

Predicted Dates of Maxima and Minima of 381 Long Period Variables for 2012

AAVSO Bulletin 75 - Predicted Dates of Maxima and Minima of Long Period Variables for 2012 - uses the same configurable format that was introduced with Bulletin 74 in 2011. Most of the information that was included in earlier numbers of the Bulletin is included in this new format, along with more information and links to external resources. Please read all of the materials provided through this page to familiarize yourself with the new, user-definable format of the Bulletin and the ways you can use it in your variable star observing and research. If you have questions, please contact AAVSO Headquarters.

 

 

Fixed Versions:

 

Live Version: The Bulletin Generator

Click here to generate a customized AAVSO Bulletin to suit your own needs.

Note: The double-maxima stars V BOO, R CEN, and R NOR are NOT included in this Bulletin Generator; click here for Bulletin 75 dates for these stars.


          Affiliate Program Internet Marketing Money and Employement        
Here's a scenario that may sound familiar: an entrepreneur finds a great home business opportunity and works hard setting up an online business. He makes a good income for the first few months, but struggles after he's sold his primary product to everyone he knows. He has three choices: to create new products, build his email list, or launch an affiliate program. Which plan do you think will result in the most future income?





Would it surprise you to know that building an affiliate program is the quickest way to build online income? Here's why - even the most lucrative home business opportunity is limited by your ability to sell a product or provide a service, unless you create passive income through affiliate sales.





To illustrate the truth of that claim, let's look at how affiliate sales multiply business. In the beginning, you might select a product or service you can market in order to provide money and employment for yourself. You market that product or service online and are able to make a respectable living. You didn't, however, take advantage of a home business opportunity to earn a respectable income. If you're like most entrepreneurs, you want to make it big, strike it rich and earn an income you could only dream about while traditionally employed. That's why you start searching for ways to expand your online empire. You notice a lot of information about affiliate programs, and decide to investigate. Here's what you learn:





Affiliate Fact #1: Affiliates are people who sell your product or service to their clients, and receive a commission for doing so.





Affiliate Fact #2: Affiliate income is passive income for you, meaning that it requires little effort on your part but still results in income.





Affiliate Fact #3: Online affiliate programs are simple to set up and administer.





Affiliate Fact #4: If you've purchased a packaged business opportunity, an affiliate program may already be built in to the structure.





Being a savvy entrepreneur, you consider these facts and decide to start an affiliate program. Where's the best place to advertise the opportunity to earn commissions for selling your products? To satisfied customers! You may be able to build all the affiliate program you can manage simply by telling your client list what you're doing.





The next stage is setting up a structure to manage that program. There are three basic components: marketing materials that affiliates can use to sell your products (email message series, website banners and ads), a way to track affiliate sales as they occur, and a system for paying commissions. As we mentioned in Affiliate Fact #4, some packaged business opportunities already have all the affiliate tools you need.





Otherwise, create marketing templates and trackable links to your sales page that identify the referring affiliate. Set up a spreadsheet or track sales through your shopping cart using the affiliate links. And finally, make sure you pay your affiliate commissions quickly. After all, a satisfied affiliate is more likely to sell more products, and that creates more income for you both.
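As a rough illustration only (not from the original article), here is one way a trackable link could be recorded in the browser with a few lines of JavaScript, assuming a hypothetical ref query parameter such as https://example.com/sales?ref=AFF123:

// Read the hypothetical "ref" parameter that identifies the referring affiliate.
var params = new URLSearchParams(window.location.search);
var affiliateId = params.get("ref");

if (affiliateId) {
  // Remember the referrer so it can be attached to the order at checkout.
  localStorage.setItem("referringAffiliate", affiliateId);
}

// Later, at checkout, read the stored ID back and record it with the sale
// (for example, in the shopping cart's order notes or a commission spreadsheet).
var referrer = localStorage.getItem("referringAffiliate") || "direct";
console.log("Credit this sale to affiliate:", referrer);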





With this information about affiliate programs, you should be able to see the possibilities presented by adding affiliates. With each new affiliate, your exposure is enhanced exponentially, with no new marketing on your part. No marketing, easy administration and a potential for unlimited income...don't those sound like good reasons to start recruiting affiliates today?


Chris Robertson is an author for Majon International, one of the world's most popular internet marketing companies.
For tips/information, click here: Money and Employment
Visit Majon's Business and Entrepreneurs directory.

Article Source: www.articlesnatch.com


          today's leftovers        
  • Google Grabs Nielsen as Business Apps User From Microsoft

    For word processing and spreadsheets, Nielsen staff now uses Google Docs and Sheets instead of Microsoft’s Word and Excel applications from its familiar Office suite of software. For video conferencing and messaging, Nielsen dropped Microsoft’s Skype in favor of Google equivalents.

  • 3DR Solo Back as Open Source Platform

    Don’t play Taps for 3DR‘s Solo yet. 3DR’s CEO Chris Anderson tweeted today that the Solo is getting a second life.

    In an article titled “The Solo Lives On,” on the ArduPilot Blog – ArduPilot is an open-source autopilot system – the team explains how a community of developers worked to give the Solo a “heart transplant.” The developer of the now-obsolete Pixhawk 2.0 hardware flight system, the Solo’s stock system, has developed a bolt-on replacement which will allow for new ArduCopter firmware changes.

  • Bluetooth Mesh networks: Is a standards body right for IoT innovation?

        

    Mesh networks are not new. It is a network topology in which each node relays data for the network. All mesh nodes cooperate in the distribution of data in the network. The IoT-purpose-built Zigbee—a low-power, low-bandwidth ad hoc network—is a mesh network. Dating to 2002, Aruba Networks was founded to build Wi-Fi mesh networks. In 2014, student protesters in Hong Kong used the mobile app FireChat to turn the crowd’s smartphones into a Wi-Fi and Bluetooth mesh network so authorities could not interrupt protesters’ coordinating conversations by blocking 3G and 4G network access.


          Mathematics for Chemistry        
This interactive electronic textbook in the form of Maple worksheets comprises two parts.

Part I, mathematics for chemistry, is supposed to cover all mathematics that an instructor of chemistry might hope and expect that his students would learn, understand and be able to apply as a result of sufficient courses typically, but not exclusively, presented in departments of mathematics. Its nine chapters include (0) a summary and illustration of useful Maple commands, (1) arithmetic, algebra and elementary functions, (2) plotting, descriptive geometry, trigonometry, series, complex functions, (3) differential calculus of one variable, (4) integral calculus of one variable, (5) multivariate calculus, (6) linear algebra including matrix, vector, eigenvector, vector calculus, tensor, spreadsheet, (7) differential and integral equations, and (8) probability, distribution, treatment of laboratory data, linear and non-linear regression and optimization.

Part II presents mathematical topics typically taught within chemistry courses, including (9) chemical equilibrium, (10) group theory, (11) graph theory, (12a) introduction to quantum mechanics and quantum chemistry, (14) applications of Fourier transforms in chemistry including electron diffraction, x-ray diffraction, microwave spectra, infrared and Raman spectra and nuclear-magnetic-resonance spectra, and (18) dielectric and magnetic properties of chemical matter.

Other chapters are in preparation and will be released in due course.
          Pax Stellarum 2.0 - Released        
I've finally finished the revision of Pax Stellarum version 2.0. This has been a major endeavor, which took me the best part of 2016 to complete, but I'm definitely satisfied with this version of the rules.

For those readers of this blog who don't know of it, Pax Stellarum is my own homebrew set of rules for playing spaceship wargames.

Starships are one of my greatest passions in this hobby, but I've never found a set of rules I was completely happy with. Every ruleset out there had some elements I liked, but lacked others. Additionally, none of them felt universal enough to allow me to play in several different scifi settings (Star Trek, Babylon 5, Star Wars, Battlestar Galactica, Stargate, etc.), and having to learn a different ruleset for each universe was out of the question for me.

Because of that, I started working on my own rules a few years ago, came up with a first version of the system, which felt good enough, and since then I've been taking notes of things I could improve, which culminates with this thoroughly revised version we have here.

You can find a link to the Pax Stellarum GoogleDrive files on the left column of this blog, half way down.




From the start, the guiding directives behind Pax Stellarum were clearly set in my head: 

 - To comfortably accommodate about a dozen or more ships per side
 - Fast gameplay
 - No book-keeping required (few things to keep track of, and even this could be done with tokens on the table)
 - Universal rules to play any scifi setting, including a broad range of weapon and special abilities options
 - Design Rules with a point system, so that I could create stats for all ships I'd like to play with
 - Options of actions in play (beyond just move ahead and shoot)

These objectives have always guided my efforts and tweaking of the rules. Still, what I usually lack is the time to playtest all the innovations I had in mind to better address those topics. That was duly dealt with this past December, though: I had several weeks of vacation and used most of the time to test, over and over again, the rules I wanted to add or change, and to see what worked and what didn't.

This version of the rules presents the result of this recent playtesting. For those already acquainted with the previous version of the rules, here are the fundamental changes you'll find in this edition:

- New Shields mechanics
- New Fighter and torpedo mechanics
- Detailed Rules for several different types of terrain
- Additional weapon and ship traits, allowing further customization


A brief introduction to PAX STELLARUM

A fundamental aspect of the game is Technology. Players must select a Tech level for their ships (usually the same for their entire fleet). This will determine what systems are available for it in the design spreadsheet.




As you can see, a number of examples are provided for each Tech level, to help players better categorize their chosen faction when designing their ships. These examples are obviously suggestions, and players are free to choose the tech level they feel best represents their race of choice.
Tech determines everything on a spaceship: how much it can purchase in Engines Rating, how powerful its weapons are, what traits those weapons may have, what special systems the ship may install, how powerful its Shields and Sensors are, etc.

Everything is costed in points (including Tech!), and a .xls spreadsheet has been developed to do all the math automatically. You just need to select the options you'd like to add, and the spreadsheet will calculate the cost/mass requirements and tell you if any of your selections violate a design rule (usually related to tech requirements).

The design spreadsheet also has a number of ready-to-use ship stat displays, from Star Trek, Star Wars and Babylon 5, so players willing to give the game a try may do so without needing to first go through the design rules. Over the months, I'll be adding more and more ship designs of my own, and will be uploading the new ones here on the blog as I go.




In terms of gameplay, players can expect to find in Pax a ruleset aimed at larger engagements, as I mentioned, but that still does well on small skirmishing actions. The game turn is divided in a number of phases, with both players acting on each of them, in alternated activations:

- Initiative Phase
- Movement Phase
- Shooting Phase
- Ordnance Phase
- End Phase

Initiative determines who gets to choose which player acts first in each phase. Usually, players will prefer to let the opponent start activating in the movement phase, and to start activating themselves in the shooting phase. The number and level of Command ships in each fleet has a major impact on the rolls to determine initiative each turn.

Movement is based on a ship's chosen level of thrust (adrift, low, or high), and this can only be changed at the end of a ship's activation, which means it only affects the next turn's movement. Thus, players are required to plan in advance how they intend to move on the following turn.
During a ship's activation in this phase, it also has the opportunity to try to take on a special order. There are 11 special orders available, and they involve pushing your engines harder to move farther, diverting power to shields, improving your accuracy by locking weapons on a target, etc.

Shooting involves the use of D10, and the concept of Quality. This is a ship stat that determines not only how well a ship shoots, but also how likely it is to pass a command check to take on a special order, among other things. 

The ordnance phase is where fighters and torpedoes move and attack, and this has been one of the areas where I've implemented the greatest number of changes, aiming at simplifying and speeding up gameplay.

In the end phase, players get the chance to repair critical damage, as well as restore their Shields Rating a bit. Additionally, at this stage, a fleet that has already lost half of its total hull points needs to take a Morale Check to remain in battle.

Failing a fleet morale check is the standard condition of ending a game. That means that the fleet will leave the battlefield, with the opposite fleet claiming the victory. However, rules for calculating victory points are also provided, so that players may use them in their own scenarios, if they so will.

The rulebook is divided into two main chapters: Basic and Advanced Rules. I encourage any of you readers who enjoy a good ol' space battle to give the basic rules a read, have a game or two with the ship designs provided in the design spreadsheet, and see if this is a ruleset for you.

Pax Stellarum is and will always be a free-of-charge ruleset; it has been developed to suit my own taste in space wargames, but I'm glad to share it with any fellow gamer, as I know that, like me, there are a lot of other ruleset enthusiasts out there.

As always, feedback is deeply appreciated, since this is a system at constant evolution, and we could say that the release of version 2.0 coincides with the beginning of the work on version 3.0 (!)





          Job Vacancy: Promotion Staff        
Female; fresh graduate; Diploma or Degree in Management or equivalent; fluency in Indonesian & English (written & oral); ability with the standard features of various personal computer word processing and spreadsheet software (Microsoft Word and Excel, PowerPoint), internet and ...

          Please advise the Singularity Institute with your domain-specific expertise!        
Submitted by lukeprog • 18 votes • 33 comments

The Singularity Institute would benefit from having a team of domain-specific advisors on hand. If you'd like to help the Singularity Institute pursue its mission more efficiently, please sign up to be a Singularity Institute advisor!

If you sign up, we will occasionally ask you questions. You may be able to answer our questions or at least point us in the right direction. We may also request to schedule a quick chat with you.

Domains of expertise we especially need:

  • Nearly all subfields of economics, maths, computer science & AI, statistics, cognitive science, physics, biology, and naturalistic philosophy.
  • U.S. & California law
  • Non-profit development and tax compliance
  • Marketing & social media
  • U.S. government and military processes (lobbying, security, infrastructure, etc.)
  • Computer security
  • Large-scale logistics
  • Event planning
  • Executive coaching/training
  • Motivational speaking, social skills training
  • Publishing
  • Running workshops and meetups
  • Nuclear security, bio-security, disease control

33 comments
          Comment on Free Yourself from Spreadsheet Overload with QuickBase by Jesse Heller        
Awesome post Eddie!
          Heartland Institute        

The Heartland Institute

Background

The Heartland Institute is a Chicago-based free market think tank and 501(c)(3) charity that has been at the forefront of denying the scientific evidence for man-made climate change. The Heartland Institute has received at least $676,500 from ExxonMobil since 1998 but no longer discloses its funding sources. The Union of Concerned Scientists found (PDF) that “Nearly 40% of the total funds that the Heartland Institute has received from ExxonMobil since 1998 were specifically designated for climate change projects.” [1]

David Padden founded The Heartland Institute in 1984 and served as its Chairman between 1984 and 1995, co-chairing with Joseph Bast. Padden was also one of the original members of the Board of Directors of the Cato Institute. Padden, a Chicago, IL-based investment banker and then owner of Padden & Company, passed away in October 2011. [13]

Padden also served on the original Board of Directors of another organization founded that year, Citizens for a Sound Economy, which later split into two groups, FreedomWorks and Americans for Prosperity (AFP). The Cato Institute and both of these organizations received their initial seed money from Koch Industries. [14]

According to a July 2011 Nature editorial,

“Despite criticizing climate scientists for being overconfident about their data, models and theories, the Heartland Institute proclaims a conspicuous confidence in single studies and grand interpretations… . makes many bold assertions that are often questionable or misleading. … Many climate skeptics seem to review scientific data and studies not as scientists but as attorneys, magnifying doubts and treating incomplete explanations as falsehoods rather than signs of progress towards the truth. … The Heartland Institute and its ilk are not trying to build a theory of anything. They have set the bar much lower, and are happy muddying the waters.” [15]

2012 Heartland Document Leak

In 2012, leaked documents revealed some of the Heartland Institute's initiatives and climate change strategy including a tailored high school curriculum. As reported at the New York Times, (“Leak Offers Glimpse of Campaign Against Climate Science”) the Heartland Institute would have help from the Charles G. Koch Foundation to “cast doubt on the scientific finding that fossil fuel emissions endanger the long-term welfare of the planet.”  

The documents also discussed “Operation Angry Badger,” which the New York Times described as “a plan to spend $612,000 to influence the outcome of recall elections and related fights … in Wisconsin over the role of public-sector unions.” [2], [3]

Heartland has promoted itself using a partial quote from The Economist that describes Heartland as “the world's most prominent think-tank promoting scepticism about man-made climate change.” However, the full paragraph in The Economist's 2012 article provides a more complete picture: “The Heartland Institute, the world's most prominent think-tank promoting scepticism about man-made climate change, is getting a lot of heat.”

“Heartland lost an estimated $825,000 in expected donations, a number of directors and almost its entire branch in Washington, DC shortly after putting up a billboard comparing those who believed in man-made global warming to the Unabomber Ted Kaczynski. At its annual meeting in Chicago, the institute's president, Joseph Bast, said Heartland had 'discovered who our real friends are.' The 100-odd guests who failed to show up for the '7th Climate Conference' were not among them.” [4], [5]

Heartland Institute and Tobacco

In the 1990s, the Heartland Institute worked with the tobacco company Philip Morris to question the science linking second-hand smoke to health risks, and lobbied against government public health reforms. Heartland continues to maintain a “Smoker's Lounge” section of their website which brings together their policy studies, Op-Eds, essays, and other documents that purport to “[cut] through the propaganda and exaggeration of anti-smoking groups.” [6]

In a 1998 op-ed, former Heartland president Joe Bast claimed that “moderate” smoking doesn't raise lung cancer risks, and that there were “few, if any, adverse health effects” associated with smoking. In a fundraising letter, Bast wrote to a Philip Morris executive that “Heartland does many things that benefit Philip Morris’s bottom line, things that no other organization does.” Later, in 2014, Bast denied that he had claimed cigarettes were not harmful, until confronted with his own op-ed. [7], [182], [8]

Roy Marden, past Corporate Affairs Policy Analyst and Manager of Industry Affairs at Philip Morris, served as a board member at the Heartland Institute from 1996 until 2008. According to Heartland, “The public health community's campaign to demonize smokers and all forms of tobacco is based on junk science.” Joseph Bast, current President and CEO, was a strong defender of RJ Reynolds brand Camel's “Joe Camel” campaign, which some have argued (here, and here, for example)  targeted younger children. [9], [10], [11], [12]

Incoming President Tim Huelskamp

In June, 2017, The Heartland Institute announced Kansas Rep. Tim Huelskamp would be replacing Joe Bast as president, to begin working starting July, 2017. Bast said he would remain with Heartland as CEO until some time in 2018. [182]

Huelskamp is former chairman of the Tea Party Caucus and a member of the far-right House Freedom Caucus. Huelskamp maintains a lifetime score of 5% with the League of Conservation Voters, with a score of 0% in 2016. A full list of legislation sponsored or cosponsored by Huelskamp is available at Congress.gov. According to his voting record tracked at OnTheIssues, Huelskamp has consistently voted against any legislation that would combat fossil fuel emissions or climate change. [183], [184], [185], [186]

According to data from OpenSecrets, Huelskamp's top donor is Koch Industries and he has received the highest lifetime campaign contributions from the Oil and Gas industry, totaling over one-quarter of a million dollars. Below are career totals added up by OpenSecrets. [187], [188]

Industry | Industry Total | Contributor | Contributor Total | Indivs | PACs
Oil & Gas | $252,393 | Koch Industries | $40,900 | $3,400 | $37,500
Retired | $209,441 | Watco Companies | $36,200 | $36,200 | $0
Crop Production & Basic Processing | $196,178 | American Bankers Assn | $35,000 | $0 | $35,000
Republican/Conservative | $167,254 | B&G Production | $34,400 | $34,400 | $0
Leadership PACs | $152,163 | House Freedom Fund | $34,025 | $250 | $33,775
Health Professionals | $124,755 | National Assn of Home Builders | $32,500 | $0 | $32,500
Misc Manufacturing & Distributing | $102,850 | Russell Stover Candies | $32,200 | $32,200 | $0
Commercial Banks | $93,600 | Vess Oil Corp | $30,000 | $30,000 | $0
Real Estate | $75,310 | Hodgdon Powder | $28,700 | $28,700 | $0
Railroads | $67,848 | Onyx Collection | $25,500 | $25,500 | $0
Agricultural Services/Products | $65,700 | KMG Tool | $24,400 | $24,400 | $0
General Contractors | $59,647 | Berexco Inc | $23,700 | $23,700 | $0
Food & Beverage | $59,550 | National Auto Dealers Assn | $22,500 | $0 | $22,500
Home Builders | $58,900 | Citizens United | $22,000 | $0 | $22,000
Securities & Investment | $55,286 | Every Republican is Crucial PAC | $20,000 | $0 | $20,000
Misc Finance | $53,250 | AT&T Inc | $19,500 | $500 | $19,000
Livestock | $51,025 | American Medical Assn | $19,000 | $1,000 | $18,000
Insurance | $50,613 | National Assn of Realtors | $19,000 | $0 | $19,000
Retail Sales | $44,700 | Ariel Corp | $18,900 | $18,900 | $0
Lawyers/Law Firms | $42,422 | Ag Services | $18,250 | $18,250 | $0

Tim Huelskamp is also a signatory to Americans for Prosperity's “No Climate Tax” pledge. The pledge reads as follows: [189]

“I, ________________, pledge to the American people that I will oppose any legislation relating to climate change that includes a net increase in government revenue”

He also told HuffPost that he didn't believe that climate change was “settled” science. [190]

“I don’t think there’s a scientific consensus on that,” Huelskamp said. “If you want to print that life begins at conception, that’s settled science.”

Stance on Climate Change

“Probably two-thirds of the warming in the 1990s was due to natural causes; the warming trend already has stopped and forecasts of future warming are unreliable; and the benefits of a moderate warming are likely to outweigh the costs.

“Global warming, in other words, is not a crisis.” [16]

“You may also know us from our work exposing the shoddy science and missing economics behind the global warming delusion. Our videos, books, studies, and international conferences changed the debate and led to the defeat of 'cap and trade.'” [17]

“Some environmentalists call for a 'save-the-day' strategy to 'stop global warming,' saying it is better to be safe than sorry. Such a position seems logical until we stop to think: Immediate action wouldn't make us any safer, but it would surely make us poorer. And being poorer would make us less safe.” [18]
“Unfortunately, global warming is an issue that is well suited to political demagoguery, which can be defined as pandering to misinformed voters and promising unrealistic solutions. Since opinion polls indicate a majority of the public believes warming is happening, politicians might think the safe strategy is to say 'I believe global warming is a serious problem and I support measures to reduce global warming pollution by supporting renewable fuels and energy efficiency.' Such politicians should be 'outed' for claiming to be smarter than scientists who have studied climate for many years and for using scare tactics to win elections.” [19]
“There is no consensus about the causes, effects, or future rate of global warming.” [20]

Funding

501(c)(3) Charitable Status

According to Heartland in 2011, “Approximately 1,800 supporters support an annual budget of $6 million. Heartland does not accept government funding. Contributions are tax-deductible under Section 501(c)(3) of the Internal Revenue Code.” As of June 2015, that number has increased to 8,300 supporters (the budget remains listed at $6 million). [21], [22]

Computer scientist John Mashey filed a complaint in 2012 with the IRS questioning Heartland's charitable status: 

“I believe there was a massive abuse of 501c(3),” Mashey said. “My extensive study of these think tanks showed numerous specific actions that violated the rules – such as that their work is supposed to be factually based. Such as there was a whole lot of behavior that sure looked like lobbying and sending money to foreign organizations that are not charities.” [23]

Mashey's 2012 report on the Heartland Institute (see PDF) also examines the finances and actions of other organizations including the Science and Environmental Policy Project (SEPP), and the Center for the Study of Carbon Dioxide and Global Change (CDCDGC).

According to his report (p. 39), the Heartland Institute has received roughly $395,000 from the tobacco company Philip Morris.

Heartland no longer reveals their individual donors, they explain, because “listing our donors in this way allowed people who disagree with our views to accuse us of being 'paid' by specific donors to take positions in public policy debates, something we never do. After much deliberation and with some regret, we now keep confidential the identities of all our donors.” [24]

Greenpeace's ExxonSecrets reports that the Heartland Institute has received $676,500 from ExxonMobil since 1998. Greenpeace also reports that Heartland received at least $55,000 from Koch Industries. [25], [26]

990 Forms

Heartland Institute as Recipient

The following is based on data from the Conservative Transparency project and from publicly available 990 forms. Not all funding values have been verified by DeSmogBlog. [27]

See the attached spreadsheet for additional information on Heartland Institute funding by year (.xlsx).  [27]

Donor Total Contributions
Donors Capital Fund $19,310,544
Mercer Family Foundation $5,088,000
The Lynde and Harry Bradley Foundation $1,215,500
Barbara and Barre Seid Foundation $1,037,977
Dunn's Foundation for the Advancement of Right Thinking $830,000
DonorsTrust $632,000
Exxon Mobil $531,500
Walton Family Foundation $400,000
Chase Foundation of Virginia $364,500
Sarah Scaife Foundation $325,000
Searle Freedom Trust $300,000
American Action Network $300,000
Barney Family Foundation1 $280,000
Friedman Foundation For Educational Choice $205,100
Jaquelin Hume Foundation $201,000
The Rodney Fund $194,000
Charlotte and Walter Kohler Charitable Trust $190,500
Stuart Family Foundation $175,000
Ed Uihlein Family Foundation $150,000
The McWethy Foundation $125,000
Castle Rock Foundation $110,000
PhRMA $90,000
JM Foundation $82,000
Charles G. Koch Charitable Foundation $62,578
Armstrong Foundation $60,000
National Association of Manufacturers $52,500
John William Pope Foundation $50,000
Windway Foundation $47,000
Arthur N. Rupe Foundation $44,000
Robert P. Rotella Foundation $42,500
The Roe Foundation $41,500
Claude R. Lambe Charitable Foundation $40,000
John M. Olin Foundation $40,000
American Petroleum Institute $25,000
Hickory Foundation $23,000
The Robertson-Finley Foundation $18,000
Woodhouse Family Foundation $12,500
The Carthage Foundation $10,000
Deramus Foundation2 $10,000
The Challenge Foundation $6,000
Foundation for Economic Education $255
Grand Total $32,722,454

1Has donated to DonorsTrust, a group that has distributed over $80 million to conservative causes, many of which deny man-made climate change.

2Has funded Philanthropy Roundtable, a spinoff of DonorsTrust and Donors Capital Fund. They all operate in a similar way to cloak the identity of donors by having donations under the name of DonorsTrust, Donors Capital Fund, or Philanthropy Roundtable.

Heartland Institute as Donor

Heartland Institute donations are listed on their 990 forms up to the year 2010. Current values are not available. [27], [28]

Recipient Total
Shimer College $500,000
Moving Picture Institute $250,000
Texas Public Policy Foundation $100,000
Americans for Prosperity Foundation $50,000
Pacific Research Institute for Public Policy $50,000
Evergreen Freedom Foundation $50,000
Maine Heritage Policy Center $50,000
James Madison Institute $50,000
International Climate Science Coalition $45,000
Galen Institute $43,000
Alabama Policy Institute $40,000
Free Enterprise Education Institute $25,000
Africa Fighting Malaria $25,000
Frontier Centre for Public Policy $25,000
Kansas Taxpayers Network $25,000
New Zealand Climate Science Coalition $25,000
Natural Resources Stewardship Project $25,000
Council for Affordable Health Insurance $20,000
Science & Environmental Policy Project $15,000
South Carolina Policy Council $10,000
Grand Total $1,423,000

Koch Funding

According to Greenpeace USA, Koch Foundations contributed $55,000 to the Heartland Institute between 1997 and 2011. [26]

*Original tax forms prior to 1997 are no longer available for verification. If you include these values, the grand total jumps to $100,000 in Koch funding from 1987 to 2011. [26]

Year Charles Koch Foundation Claude R. Lambe Charitable Foundation Grand Total
*1987 $5,000 $5,000
*1988 $5,000 $5,000
*1989 $5,000 $5,000
*1992 $10,000 $10,000
*1995 $10,000 $10,000
*1996 $10,000 $10,000
1997 $10,000 $10,000
1998 $10,000 $10,000
1999 $10,000 $10,000
2011 $25,000 $25,000
Grand Total $60,000 $40,000 $100,000

The Heartland Institute's leaked 2012 Fundraising Plan states that “The Charles G. Koch Foundation returned as a Heartland Donor in 2011. We expect to ramp up their level of support in 2012 and gain access to the network of philanthropists they work with.”

However, the Foundation since released the following statement: “… the Charles Koch Foundation provided $25,000 to the Heartland Institute in 2011 for research in healthcare, not climate change, and this was the first and only donation the Foundation made to the institute in more than a decade. The Foundation has made no further commitments of funding to Heartland.” [29]

Donors Capital Fund/DonorsTrust

Donors Capital Fund (DCF) and its partner organization DonorsTrust allow donors to fund organizations anonymously. They appear to be a spinoff of the Philanthropy Roundtable, a group run by Whitney Ball, who also launched DonorsTrust.

The Heartland Institute has received large anonymous donations through DCF and DonorsTrust, with a combined total of at least $15,391,794.

See p. 58 of the 2012 Mashey Report for more details. John Mashey also covers DCF on page 65 of his 2012 report. According to DCF's website, “Donors Capital Fund is an IRS-approved, 501(c)(3), 509(a)(3) supporting organization that is associated with DonorsTrust, a public charity and donor-advised fund formed to safeguard the charitable intent of donors who are dedicated to the ideals of limited government, personal responsibility, and free enterprise.”


DonorsTrust refers clients to Donors Capital Fund if they expect to open donor-advised funds of over $1,000,000. [30]

DonorsTrust

DonorsTrust contributed at least $631,250 between 2008 and 2012 to the Heartland Institute: [31]

Source 990 forms:

Year Project Amount
2008 general operations $11,750
2009 general operations $1,000
general operations $1,000
general operations $3,000
general operations $5,000
“advertising in response to organization's emphasis on marketing of research.” $6,500
general operations $500,000
Total $516,500
2010 general operations $250
general operations $1,000
general operations $1,000
Total $2,250
2012 Seventh International Conference on Climate Change $100,000
general operations $250
general operations $500
Total $100,750
Grand Total $631,250

Donors Capital Fund

From 2005 to 2013, DCF contributed at least $14,760,544 to the Heartland Institute (possibly more, as some details are missing from 2006's 990): [32]

Source 990 forms:

Year Project Amount
2005 Individual projects not listed. $550,427
2007 Individual projects not listed. $2,955,437
2008 general operations $2,000,000
“the global warming research project” $900,000
“media materials” $100,000
“staff directed research” $126,000
“final installment of three-year general ops support” $1,300,000
“global warming research projects” $184,000
Total $4,610,000
2009 “CORE” $10,590
“G.W. reporting for one year” $150,000
“health care project” $190,000
“Ranthum, Australia and Old projects” $300,000
general operations $400,000
$620,940 for “GW-end” and $500,000 for annual support $1,120,940
Total $2,171,530
2010 for the organization's India Meeting Project $14,150
general operations $1,650,000
Total $1,664,150
2011 $49,000 for the NIPCC/Climate Change Project and $80,000 for School Choice in TX $129,000
2012 general operations $1,000,000
2013 for Climatism books & DVD projects $100,000
for the Sri Fi Project ($60,000) and the  New Zealand Project ($20,000) $80,000
general operations $1,500,000
Total $1,680,000
Grand Total $14,760,544

Anonymous Donor

One Anonymous Donor has contributed a large percentage of Heartland's budget in past years, with a focus on their global warming projects.

According to the Heartland 2012 Fundraising Plan, the Anonymous Donor made the following contributions from 2007-2011:

Project 2007 2008 2009 2010 2011
General Operating $500,000 $500,000 $500,000 $700,000 $350,000
Ramp Up Program $800,000 $800,000 $400,000 $0 $0
Global Warming Projects $1,976,937 $3,300,000 $1,732,180 $964,150 $629,000
Health Care $0 $0 $190,000 $0 $0
School Reform $0 $0 $0 $0 $80,000
Total $3,276,937 $4,600,000 $2,822,180 $1,664,150 $979,000

Illinois auditor reports for 2003-2009 reveal that a single donor (possibly the same individual as the “Anonymous Donor”) contributed the following percentages of outstanding accounts receivable in those years (also see p. 56 of John Mashey's report):

2004 (PDF — See p. 27) — 74% contributed by two donors.

2005 (PDF — See p. 32) — 74% from one donor.

2006 (PDF — See p. 33) — 25% from one individual.

2007 (PDF — See p. 32) — 38% fr

          Ruby on Rails        
Recently I was working on an XML feed generation task in a RoR project. I did a cursory performance analysis, recording file size against generation time. The interesting thing was that the time taken to generate the file increased exponentially with increasing file size. So I decided to compare Ruby's performance against Java, C, C++, Scala, and Javascript. I have tabulated the results of the benchmark tests from http://shootout.alioth.debian.org.


The results were surprising, though hardly any N-body solutions would be implemented in Ruby in reality (that benchmark is included just for visualising performance under heavy computation load).

I realized that RoR is ideal for projects that don't have computation-heavy or memory-intensive operations in the backend. Alternatively, we can build such backend systems in other languages that are CPU- and memory-efficient and build the frontend with RoR. We shouldn't expect RoR to do that.

I am not putting RoR down, but I think we should reflect on the nature of the system we will be building.

          How to find out what is using a specific account.        

I have attached the spreadsheet exactly as the script created it. The only change I have made is to the computer name itself.

You can see where it forces a carry to a new line with the ?????.


          Manage Your Product Content With the Free Icecat PIM        
Download this post in PDF format As a manufacturer, managing your product information can be a huge challenge: new pictures, expiring visual rights, awards to be added, a revised marketing text, some spec updates… How to make sure that your product managers are fully aligned and get the most up-to-date content? Spreadsheets and email chains … Continue reading Manage Your Product Content With the Free Icecat PIM
          Packaging Engineer - Lactalis American Group - Buffalo, NY        
Microsoft Excel Spreadsheet software and Microsoft Word Processing software and Power Point software, Palletization software (TOPS or CAPE), AutoCAD....
From Lactalis American Group - Thu, 22 Jun 2017 21:02:49 GMT - View all Buffalo, NY jobs
          4 Spreadsheet Alternatives to MS Excel        

There was a time when office compatibility was a bit of a problem on Linux, but with the latest office suites out there available for Linux, this is not an issue anymore. The applications here mimic MS Excel’s behavior, so switching to one of them should be pretty straightforward.

read more


          Jo Daviess Last Will and Testament, 1829-1926        
FYI, FamilySearch.org has all the images of wills from Books A thru M posted on their website. Unfortunately, it is not indexed.

I've been working to create an index in google docs. It's a work in progress, but some may find it helpful.

If interested, it can be viewed at:

https://docs.google.com/spreadsheet/pub?key=0AswlLr1jA48UdDZ...



          Never let them see you sweat        
Houston Grand Opera Chief Financial Officer Rauli Garcia gets a taste of stage life as an actor in Dead Man Walking
I arrived a little early to a new kind of chaos. Some new cast members had arrived …  I asked a super who they were: “the superstars, man,” he said. The principal singers were joining us. It was really cool to see that they looked like anyone else while off the stage. I’m not sure if I expected anything different, they were just hanging out. I also learned that the other supers who I thought were actors were just normal people like me. They work full-time jobs, and then come to Houston Grand Opera in the evening. They do it because they love opera or the stage life, and this is a good way to be involved. 

Artists arrive for the biggest rehearsal yet.

I went to the far side of the room to watch as people began to file in. The men’s chorus, children’s chorus, supers, musicians, singers, stage managers, and the production team arrived in droves. I had never seen so many people in that space before.
We began to go through the first act. It was amazing to hear it so closely. It was loud, in a good way. The voices of the principal singers were magnificent.  I have heard Joyce DiDonato on stage, but hearing her up-close was much more intense. This is an emotional opera to begin with, and the singers look like they are really feeling the emotions as they rehearse. 

Later, we went through the scene that we had practiced with the entire group. The one with the synchronized steps (“Left, right, left, right …”). It became even more complicated when we added the chorus and singers. The music was so loud that I couldn’t hear the cue to begin my march. I was late, which made the other supers late, which made me nervous. We continued through much of the act. 

At the next rehearsal, the supers were called with just the principal singers and the men's chorus. We went through several scenes, and in my mind I focused on the scene I had been concerned about: the marching scene. We went through it several times. Each time, someone was off. I felt like it was usually me, missing the cue. I was starting to sweat! This time there were more singers, more people marching, and yes, more complexity. I offered to step out of that scene; so far, I'm still there.
Isn’t there a way for me to put this into a spreadsheet?!?  That would solve everything.

A note about the patience I see in the rehearsal room.  Maestro Patrick Summers, HGO Music Director and the conductor of Dead Man Walking and Leonard Foglia, the show’s director, show boundless patience while putting the opera together. The process is very detailed. There are many pauses in the rehearsal process. Stage Management yells, “Hold Please!” and everyone stops. Then Patrick or Lenny ask for small changes to be made, we back up a few moments, and start over again. It happens over and over as they tweak and adjust the production into alignment with their vision. 
I, for my part, will keep my head down and continue to do what I'm told.

          Robert Bryce        

Robert Bryce

Credentials

  • B.F.A., University of Texas at Austin (1986). [1]

Background

Robert Bryce is an American author and journalist based in Austin, Texas. [17] He has regularly been cited as an “expert” on energy issues in the media, but has been under increased scrutiny after writing numerous articles in media outlets that did not disclose his ties to the fossil fuel industry. [16]

Many of Bryce's articles have been on the energy business. He spent 12 years writing for The Austin Chronicle. From 2006 to September 2010 he worked as the managing editor of the online publication Energy Tribune. [17]

From October 2007 to February 2008 he was a fellow at the Institute for Energy Research (IER). In April, 2010 he joined the Manhattan Institute as a senior fellow in its Center for Energy Policy and the Environment. [2], [17]

The Manhattan Institute is a policy think tank that has received significant funding from both ExxonMobil and Koch Industries. According to Media Transparency, the Manhattan Institute has been known to "obscure" science supporting man-made climate change. [3]

Bryce has been unwilling to answer questions about the funding the Manhattan Institute receives from the fossil fuel industry. [15]

Stance on Climate Change

“The science is not settled, not by a long shot. […] If serious scientists can question Einstein’s theory of relativity, then there must be room for debate about the workings and complexities of the Earth’s atmosphere. Furthermore, even if we accept that carbon dioxide is bad, it’s not clear exactly what we should do about it.” [4]

“On the science of global climate change, I'm an agnostic. I've seen Al Gore's movie, and I've read reports from the Intergovernmental Panel on Climate Change. I've also listened to the 'skeptics.' I don't know who's right.” [5]

Key Quotes

“[W]e should be cheering the news that coal use is rising. For it means that more people are escaping the darkness and joining the modern world.” [25]

”[…] the job [at the Manhattan Institute] gives me a platform where I can focus on the themes that I explored in both Gusher of Lies and Power Hungry: that the myths about “green” energy are largely just that, myths; that hydrocarbons are here to stay; and that if we are going to pursue the best “no regrets” policy with regard to energy, then we should be avidly promoting natural gas and nuclear energy.” [6]

“It’s time to move the debate past the dogmatic view that carbon dioxide is evil and toward a world view that accepts the need for energy that is cheap, abundant and reliable.” [4]

Key Deeds

May 15, 2016

Robert Bryce published an op-ed in the Wall Street Journal titled “An Ill Wind: Open Season on Bald Eagles“ criticizing new regulations put in place by the U.S. Fish and Wildlife Service (FWS) governing the accidental harming of bald and golden eagles. According to Bryce, the FWS is “trying to make it easier for the wind industry to kill” eagles. [19]

On May 6, the blog Daily Kos predicted Bryce's piece, saying Bryce and the Journal would likely contribute a “fresh round of fossil fuel-penned pieces crying crocodile tears for birds” shortly after the FWS regulations were announced: [20]

“Bryce wrote op-eds attacking wind power in February, October and November 2013, which are all similar to one he wrote in 2009, and just like what he wrote in 2015. Since he already attacked wind power back in February of this year, one might think the WSJ editors wouldn’t want to go back to him for essentially a rerun of the same op-ed. But the WSJ has published over twenty of his pieces since 2009, all of which are either explicitly anti-wind or pro-fossil fuels,” Daily Kos writes.

Media Matters notes that Bryce's Op-Ed piece is “misleading”: [21]

“Bryce complained that the new rules would allow wind energy producers to kill or injure up to 4,200 eagles per year and hyped data showing that wind turbines were responsible for about 573,000 total bird deaths (not just eagles) in 2012. But as the Daily Kos piece explained, it is misleading to cite these figures without explaining that wind turbines are responsible for only 'about 3 percent of human-caused eagle deaths' and that other factors – including the oil and gas industry and climate change – are a much greater threat to birds than wind energy.”

May 4, 2016

Robert Bryce authored a Manhattan Institute report titled “What Happens to an Economy When Forced to Use Renewable Energy?” (PDF). [28]

Bonner R. Cohen promoted the new study at the Heartland Institute. He writes that policies to combat climate change in Europe “have led to soaring electricity costs for residential and commercial customers, leading the authors to recommend the United States reject similar policies.” [29]

“To avoid the kinds of results seen in Europe, U.S. policymakers at the federal and state levels should be required to do rigorous cost-benefit analyses before imposing renewable-energy mandates,” Robert Bryce said. “U.S. policymakers must also consider the impact higher energy costs will have on overall employment and industrial competitiveness.” [29]

February 7, 2016

Robert Bryce published an op-ed in the Wall Street Journal "attacking [Bernie] Sanders' renewable energy plan," reports Media Matters. [23]

In the op-ed, Bryce claimed that Sanders “better check with his Vermont constituents about the popularity of wind energy.” He adds, “Nowhere is the backlash [against wind energy] stronger than in Mr. Sanders's state.” [24]

According to Media Matters, Bryce's statement is not accurate: "despite the presence of a vocal minority who oppose large-scale wind projects, support for wind energy development is actually very strong in the Green Mountain State." They cite a 2014 poll by Fairbank, Maslin, Maulin, Metz & Associates for the Vermont Public Interest Research Group (VPIRG) that found 71 percent of Vermonters support building wind turbines along the state's ridgelines, while only 23 percent oppose wind energy development. [23]

June 22, 2015

Robert Bryce wrote a column in the National Review titled “The Poor Need More Energy: What BP Knows and Pope Francis Doesn't,” where he  maintained that the best, low-cost energy source for developing countries is coal. [14]
 
According to Bryce, “[Pope Francis's] new encyclical on climate change, Laudato Si’ (Be praised), shows a shallow understanding of global energy use and, in particular, of how energy consumption is soaring among the people he claims to care most about: the poor.” 
“But if developing countries are going to prepare for possible changes in the climate, they will have to get richer so they can afford to deal with any calamities that may occur. And how will they get richer? The answer is obvious: by consuming more energy. And for countries throughout the developing world, the lowest-cost energy is still coal,” Bryce writes. [14]

February 25, 2014

Robert Bryce testified before the Senate Committee on the Environment and Public Works, where he contended that "federally subsidized efforts that are being undertaken to, in theory, address climate change, are damaging America's wildlife." [26]

Bryce focused his attacks on the wind industry:

“Given the studies already done on wind energy’s deleterious impact on wildlife, combined with the 'energy sprawl' that will come with the industry’s continuing expansion, it is virtually certain that as the wind sector adds more turbines, more federally protected wildlife – including more bald eagles, an animal that has been on the Great Seal of the United States since 1782 – will be killed.[xl] And thanks to the production tax credit, taxpayers will be subsidizing the slaughter.

The question at hand is obvious: why are policymakers implementing an energy policy that is a known killer of wildlife in exchange for what are infinitesimally small reductions in carbon dioxide emissions?” [26]

October, 2011

An October 2011 petition submitted by the Checks and Balances Project complained about Bryce, pointing to "a disturbing trend of special interests surreptitiously funding 'experts' to push industry talking points in the nation's major media outlets." [7]

DeSmogBlog reported on this issue here, and here.

According to the letter, “pundits like Mr. Bryce have the right to share their views, but we believe media outlets have the responsibility to inform their readers of opinion writers’ true ties and conflicts of interest.”

At issue was an Op-Ed by Bryce titled "The Gas Is Greener," which criticized renewable energy, including wind projects, and purported to expose hidden costs and "deep contradictions" in the "renewable energy movement." [8]

Signatories asked the New York Times to set the standard by revealing the ties of these "experts" and ensuring readers get the full story.

New York Times editor Arthur S. Brisbane responded, dismissing the petition's request and saying that “I don’t think Mr. Bryce is masquerading as anything: experts generally have a point of view. And the Manhattan Institute’s dependence on this category of funding is slight — about 2.5 percent of its budget over the past 10 years. But the issue of authorial transparency is an important one, albeit one that isn't always simple.” [9]

August 11, 2011

According to SourceWatch, Bryce was a featured speaker at the 2011 American Legislative Exchange Council (ALEC) Annual Meeting at a workshop titled “Unconventional Revolution: How Technological Advancements Have Transformed Energy Production in the United States.”

The panel advocated the process of fracking for reaching unconventional gas reserves. Bryce has also published articles in favour of fracking; in one example he presents the often-repeated industry claim that fracking poses "minimal risk" to groundwater. He stressed that New York "can't afford to be left behind in the shale revolution." [18], [10]

In a June 13, 2011 piece published in the Wall Street Journal he wrote that the “shale revolution now underway is the best news for North American energy since the discovery of the East Texas Field in 1930.” [11]

May 12, 2010

Bryce wrote an Op-Ed in the New York Times expressing his opposition to the implementation of carbon capture technology.

He was particularly critical of a senate energy bill introduced by John Kerry and Joseph Lieberman which would include incentives of $2 billion per year for carbon capture and sequestration.

Bryce wrote “That's a lot of money for a technology whose adoption faces three potentially insurmountable hurdles: it greatly reduces the output of power plants; pipeline capacity to move the newly captured carbon dioxide is woefully insufficient; and the volume of waste material is staggering. Lawmakers should stop perpetuating the hope that the technology can help make huge cuts in the United States’ carbon dioxide emissions.”

He also predicted public opposition to carbon dioxide sequestration areas, writing how "few landowners are eager to have pipelines built across their property. And because of the possibility of deadly leaks, few people will want to live near a pipeline or an underground storage cavern. This leads to the obvious question: which members of the House and Senate are going to volunteer their states to be dumping grounds for all that carbon dioxide?" [12]

April 8, 2009

Bryce wrote an article titled "Let Exxon Run the Energy Dept." in The Daily Beast. The article is strongly critical of the Obama Administration, which he claims is "working to marginalize America's single biggest sector, the sliver of the economy that produces our most essential commodities: gasoline, diesel fuel, jet fuel, coal (which provides about half of the country's electricity) and natural gas."

Bryce writes “the U.S. has never had a secretary of Energy who has actually drilled an oil well, built a nuclear power plant, or dug coal out of the ground. Indeed, actual experience in the energy business appears to be grounds for disqualification. This is stunning.”

In conclusion, Bryce suggests that maybe we should include more people representing the energy industry in government: “Maybe—just maybe—those energy companies aren’t so villainous after all. And here’s another wacky thought: Maybe—just maybe—we should have a few people in government who really understand how the energy business works.” [13]

Affiliations

Publications

Many of Robert Bryce's recent articles can be viewed at RobertBryce.com. In 2011, Media Matters noted 39 instances in which Robert Bryce appeared in the media without sources mentioning his ties to the oil industry. [3]

Bloomberg

Media Matters reports that Bloomberg has published several columns by Robert Bryce without disclosing that he is a senior fellow at the Manhattan Institute's Center for Energy Policy and the Environment, which has received significant funding from ExxonMobil. Some of Bryce's cited publications below: [22]

Manhattan Institute

Some sample Manhattan Institute Publications below:

In the past, the Manhattan Institute also listed "research" by Robert Bryce: [27]

Books

Other Recent Publications

Robert Bryce has contributed hundreds of articles to multiple news sources, often attacking renewable energy sources and environmentalists while encouraging the use of fossil fuels and coal power. View the attached spreadsheet for samples of Robert Bryce's latest publications (.xlsx).

Resources

  1. “Robert Bryce,” Profile at the Manhattan Institute for Policy Research. Archived May 26, 2016.

  2. “Bio,” Robertbryce.com. Archived May 26, 2016. WebCite URL: http://www.webcitation.org/6hmTUbf60

  3. “Who Is Robert Bryce?” Media Transparency, October 7, 2011. Archived May 26, 2016.

  4. “Five Truths About Climate Change,” The Wall Street Journal, October 6, 2011. Republished by the Manhattan Institute for Policy Research. Archived July 6, 2012. Archived .pdf on file at DeSmogBlog.

  5. Robert Bryce. “If More CO2 Is Bad … Then What?” The Austin Chronicle, December 7, 2007. Archived May 26, 2016.

  6. Robert Bryce. “Farewell: My Final Column for Energy Tribune,” September 30, 2010. Archived May 26, 2016.

  7. “Letter To The New York Times,” TrueTies, October 6, 2011. Archived May 26, 2016.

  8. Robert Bryce. “The Gas Is Greener,” The New York Times, June 7, 2011. Archived .pdf on file at DeSmogBlog.

  9. Arthur S. Brisbane. “The Times Gives Them Space, but Who Pays Them?” The New York Times, October 29, 2011. Archived .pdf on file at DeSmogBlog.

  10. Robert Bryce. "Phony Fracking Fears for NY," New York Post, December 15, 2011. Archived July 6, 2012. Archived .pdf on file at DeSmogBlog.

  11. Robert Bryce. "America Needs the Shale Revolution," Wall Street Journal, June 13, 2012. Archived .pdf on file at DeSmogBlog.

  12. Robert Bryce. “A Bad Bet on Carbon,” The New York Times (Opinion), May 12, 2010.

  13. Robert Bryce. “Let Exxon Run the Energy Dept.” The Daily Beast, April 8, 2009.

  14. Robert Bryce. “The Poor Need More Energy: What BP Knows and Pope Francis Doesn't,” National Review, June 22, 2015. Archived September 5, 2015.

  15. Brendan DeMelle. “Accountability Moment: Manhattan Institute's Robert Bryce Squirms And Evades Question on Fossil Fuel Funding,” DeSmogBlog, February 9, 2012.

  16. Farron Cousins. “Robert Bryce – The Media’s Industry-Funded Go-To Guy,” DeSmogBlog, October 12, 2011.

  17. “About Bryce,” RobertBryce.com. Archived October 3, 2010. Archived .pdf on file at DeSmogBlog.

  18. Robert Bryce. “How fracking lies triumphed,” New York Daily News, January 22, 2012. Archived May 26, 2016. WebCite URL: http://www.webcitation.org/6hmSizDSE

  19. Robert Bryce. “An Ill Wind: Open Season on Bald Eagles,” Wall Street Journal, May 15, 2016. Archived .pdf on file at DeSmogBlog.

  20. “The Birds and the ‘Bines: Wind Turbine Regulations Revised,” Daily Kos, May 6, 2016. Archived May 26, 2016.

  21. Andrew Seifter. “Big Oil Cheerleader Robert Bryce Predictably Misleads On Wind Energy And Eagle Deaths In WSJ,” Media Matters for America, May 16, 2016. Archived May 26, 2016. 

  22. “Why Has Bloomberg Given Robert Bryce A Platform To Attack Renewables Without Disclosing That Oil Pays His Salary?” Media Matters for America, June 9, 2015. Archived May 26, 2016.

  23. Andrew Seifter. “Bernie Sanders' Wind Energy Plan Falsely Attacked By Big Oil Ally, With Help From The Wall Street Journal,” Media Matters for America, February 8, 2016. Archived May 26, 2016.

  24. “The Windmills of Bernie's Mind,” The Wall Street Journal. Republished by the Manhattan Institute, February 8, 2016. Archived .pdf on file at DeSmogBlog.

  25. Robert Bryce. “Coal use is soaring – that's good news,” The Hill, December 22, 2014. Archived May 26, 2016. WebCite URL: http://www.webcitation.org/6hnZoXg9b

  26. "U.S. Senate Testimony: Killing Wildlife in the Name of Climate Change," RobertBryce.com, February 25, 2014. Archived May 26, 2016. WebCite URL: http://www.webcitation.org/6hnaxQpe1

  27. “Robert Bryce,” Manhattan Institute. Archived July 6, 2012. Archived .pdf on file at DeSmogBlog.

  28. Robert Bryce, “What Happens to an Economy When Forced to Use Renewable Energy?” (PDF), The Manhattan Institute, May 4, 2016. Archived .pdf on file at DeSmogBlog.

  29. Bonner R. Cohen. "Study Shows the High Economic Costs of Renewable Energy," Heartland Institute, June 14, 2016. Archived June 25, 2016. WebCite URL: http://www.webcitation.org/6iXTFFgPV

Other Resources


          Comment on EPI 114 | Bookkeeping requirements for property investors by Kaz        
Thanks for your comments, appreciate your input. I am also a fan of the basic spreadsheet myself. Some people like other solutions that take away almost all effort! I thought it was worth exploring with Penni on the show. Xero is a great product, we use it for the accounting side of our business but I'm a bit like you, think the spreadsheet does me just fine! Cheers Kaz
          Comment on Renovation Update #6 – Do the work by Kaz        
Hi Ty, thanks for your question! The project plan was a MS Project Gantt Chart (I used to be the MS Project Queen in my previous corporate role!). Unfortunately I don't know that I have that project plan at all anymore as it was a few years ago now - in fact I don't actually own windows based computers anymore so I don't have MS Project! Basically, what I did was just created a room by room outline and the filled it in with the tasks required in each of those rooms/areas. A gantt chart can get tricky when you start to add in dependencies and effort/duration etc. If you're familiar with gantt charts, then jump in! There are plenty of free online software applications you can use to create them - but if it sounds a bit daunting then just using a spreadsheet can work as well or even just a massive 'To do' list! Cheers Kaz
          Comment on EPI 087 | Real success in property with Elise Parker by Kaz        
Hi Stan, Some people use standard accounting packages such as MYOB, Quicken or Xero and find those to be good. Me, I just use a spreadsheet as I find it quicker and simpler to enter data - if something is time consuming or complex then I'm less likely to do it! There are some property specific software packages - such as My Property Tracker (http://mypropertytracker.com.au/) or Real Estate Investar (http://www.everydaypropertyinvesting.com/link/realestateinvestar) - some of these such as Real Estate Investar offer analysis, search and valuation services as well. You've given me a good idea for a blog post - to review all of the available property software packages around - will get onto that! Kaz
          Comment on EPI 087 | Real success in property with Elise Parker by Stan Hamilton        
Hi Kas I wondering are there any good computer programs for storing all your details about your rental properties, like MYOB or Quicken, I use a Spreadsheet now but feel some sort of Accounting Program may also be beneficial. Can you recommend one or some for me, I want to better control the facts , figures and Information my Property Portfolio generates. Thank You Stan Hamilton Mackay, Qld
          Preparing for the Blaggers' Banquet and musings on the power of Twitter        


"There are known knowns; there are things we know we know. We also know there are known unknowns"

Donald Rumsfeld, that old rogue, may have uttered these words in a rather different context but there were moments last week when I thought I was planning a military campaign, not a fundraising banquet. And believe me, there were a scary number of known unknowns in the run-up to the philanthropic feast that was the Blaggers' Banquet

Amusing as Rummie's comment is, in writing this blog post it seemed apt to turn instead to the image of Wonderwoman for inspiration. I can't convey to you how much I wished I had superhero qualities like hers last week, yet lo and behold it turned out the team of Blaggers' Banquet organisers evinced some of their own action hero qualities resulting in a banquet that has thus far been universally praised

Planning the banquet was a team effort, led by:
Niamh of Eat Like A Girl

and pulled together by:

Kavita Favelle
Mathilde Deville
Denise Medrano
Susie Sandford Smith
Ailbhe Phelan
Linda Williams

When Niamh asked me a few weeks ago to be in charge of the Blaggers' Banquet kitchen I naively assumed all I had to do was put together a menu and gather some ingredients. "Piece of cake, time for a cocktail!" I thought

Then Niamh told me what had already been blagged in terms of ingredients. And then I heard the price tag for each ticket:

£75 for a 4-course meal including wines

Gulp. So no ordinary dinner party then

Rather than share with you the complete picture of what happened in the run-up to the banquet (trust me, it's a long story) I thought I'd proffer a few anecdotes from this past week of preparations to give you an idea of how we pulled the event together, focussing especially on those suppliers I worked with

As an anthropologist I've been keeping a close eye on Twitter in recent months and last week the potency of this "new media" really hit home for the first time. Twitter became the single most useful medium for us to galvanise banquet volunteers, tempt guests to part with that £75 and crucially from my perspective as head chef for the event - a brilliant tool in sourcing ingredients at short notice. Below I've done a chronological countdown to D-Day, or B-Day as Niamh might say, and have copied in a few tweets to give you an idea of how dynamic the process of planning this banquet became. Click on #blaggersbanquet and you can see for yourself how many thousands of tweets the banquet generated


Friday 6th November:
meet with Niamh at Borough Market to discuss strategy. Umpteen cups of coffee later, and glares from Monmouth staff as we take up their table, we've got a spreadsheet to assign roles to all the bloggers. Somehow I manage to double-book myself and the inimitable MsMarmitelover joins us for part of the strategy session when she and I are meant to be meeting to discuss the menu for her forthcoming Umami Night

I tweeted that Niamh and I have a plan of action in place:

Great plans afoot for #blaggersbanquet the fun part of creating a menu now begins!
9:02 PM Nov 6th from TweetDeck

Off I totter back to Bloomsbury, confident that we can pull this off in ten days' time. A quiet weekend beckons and that evening the Man and I get drenched in monsoon-like rain going to the cinema. Spend weekend lurgified. Awesome

Monday 9th November: time to tackle the Google spreadsheets the bloggers have set up for the banquet:

two university degrees and proficiency in four languages are clearly of no use in dealing with excel spreadsheets, am totally befuddled
2:38 PM Nov 9th from TweetDeck

Baking bread is one of my core skills, Excel spreadsheets most definitely not. Start tackling recipes for the menu instead. What's going to be most effective on the night both in terms of mise-en-place and for the kitchen team? Hawksmoor's kitchen reputedly tiny so we need a menu that won't collapse under the stress of serving 50 diners

Having a dyslexic's nightmare reading through recipes for Sunday's #blaggersbanquet resorting to ruler to read each line!
3:52 PM Nov 9th from TweetDeck

In a moment of mischief I think to myself "what if we made the front of house bloggers wear catsuits for the banquet?" I suspect that might bring in the punters and tweet:

Contemplating whether I can get away with asking bloggers to wear catsuits while serving at Sunday's #BlaggersBanquet
5:36 PM Nov 9th from TweetDeck

To which I receive almost immediately several tweets from the bloggers telling me mutiny is nigh if I enforce the catsuit rule. Spoilsports

Have sourced delicious fresh goat's and chived goat's cheese from Sarah at Brockhall Farm; being a curd-nerd, this is fantastic news

Tuesday 10th November: still grappling with menu and spreadsheets, trying to figure out which ingredients have actually been blagged and are confirmed, which have just been inserted by the bloggers but not yet confirmed. Turns out we have a major problem: where to store large quantities of meat and other ingredients in need of refrigeration. Hawksmoor's fridges are full to the brim so Niamh tweets:

RT @eatlikeagirl: We urgently need somewhere that can offer us some refrigerated storage for lots of meat for a few days #blaggersbanquet
1:45 PM Nov 10th from TweetDeck

This evolves into an ongoing problem all week. Next time the banquet shall only feature chestnuts and mushrooms

I talk to the good people at Fish For Thought about sourcing seafood for the banquet. They prove to be so helpful that they hand-deliver monkfish to Hawksmoor on Sunday AM. More on this in the next blog post about the banquet itself

Good news about desserts that Tuesday afternoon:

#blaggersbanquet dessert is now in the bag! Thanks to lovely @TrishDeseine for donating chocolate and her favourite chocolate cake recipe
1:54 PM Nov 10th from TweetDeck

Being in charge of the banquet puddings I do a little dance. And pop a piece or three of chocolate in my mouth to celebrate

The question of non-alcoholic soft drinks then comes up. Some confusion as to whether this has been blagged. Google spreadsheet not giving clear answer, so I tweet:

#BlaggersBanquet we're looking for non-alcoholic soft drinks for Sunday. DM me if you have a contact for cordial/soft drink Co. Thanks!
4:43 PM Nov 10th from TweetDeck

Again, almost immediately I receive three replies. I choose Firefly Tonics and within minutes we have soft drink supplies for Sunday sorted. Emma Dalglish at Firefly offers to donate 144 bottles to the event and is nothing but helpful throughout the day as we discuss deliveries and quantities. Makes a contrast to some suppliers I had been dealing with up to that point

The question of bread arises. Sourcing cheese isn't proving problematic, but bread is. Chris Young at the Real Bread Campaign kindly offers to contact Real Bread suppliers to see if any will donate

I ask my friend Wendy at Peter's Yard if she could spare us some crispbread for the banquet and within minutes she's replied saying yes. Not only does she say yes but offers two fantastic bread books as prizes for the banquet, and goes to bat asking some of her contacts for further ingredients. I tweet:

Oh the wonderful people @PetersYard have promised their delicious crispbread for #blaggersbanquet * thank you *
8:14 PM Nov 10th from TweetDeck

We also hear from Kate at Lahloo Tea that she will donate green jasmine tea and a tea chocolate by Damian Allsop to the banquet. Close of play on Tuesday I tweet this to thank them all

Many thanks to @brockhallfarm @PetersYard @lahlootea @FireFlyTonics + @foodforthink for helping this non-blagger out w/ #blaggersbanquet !!
9:05 PM Nov 10th from TweetDeck

Wednesday 11th November: the final push I had hoped would be today. How wrong I was, we would still be sourcing ingredients up to the last minute:

Good morning all, putting in final orders this morning for #blaggersbanquet VERY excited about Sunday :)
11:32 AM Nov 11th from TweetDeck

Some banter by fellow tweeters at my accepting deliveries of Brockhall Farm cheese in the next few days - apparently I have a reputation for being a greedy cheesemonster:

@goodshoeday you're suggesting I would eat all of @brockhallfarm 's cheese before Sunday?! Do you know how much we're getting?!
12:25 PM Nov 11th from TweetDeck in reply to goodshoeday

The remainder of Wednesday was spent locked away in a film studio near Kentish Town as part of filming for the final of Gordon Ramsay's F-Word. Predictably, I turned my phone off for the day and all hell broke loose with sourcing the main course for the banquet. Some suppliers just didn't get back to me, others insisted on being the sole supplier. Rather shockingly I fielded several enquiries for free tickets, requested without a trace of irony. Erm, this may be the Blaggers' Banquet, but isn't it somewhat shameless to blag tickets when it's for charity?

Thursday 12th November: Catching up on emails from Wednesday, now having to scramble for supplies. Increasingly grumpy at how problematic it is dealing with some PRs when earlier in the week the PRs and suppliers I dealt with were straightforward and incredibly helpful

Unduly excited by email from the School of Artisan Food - they offer a short course in either baking, cheesemaking, brewery, butchery or preserving as an auction prize for the banquet. Result! I might just bid on that prize when it goes online

Pong cheese arrives, and boy is it a pongy selection:

Pong cheese for #BlaggersBanquet has arrived thank you good people of @PongCheese the flat is rather fragrant ;)
12:57 PM Nov 12th from TweetDeck

It takes every ounce of willpower not to tuck into Pong's finest - I love stinky cheese. As soon as I've cleared space in my little fridge the next delivery comes:

And lovely goat's cheese from @brockhallfarm has just arrived, along with fresh goat's milk for the #BlaggersBanquet *thanks Sarah*
1:32 PM Nov 12th from TweetDeck

Earlier in the week I realised we had hardly any dairy products and emailed the bloggers to see if they might have contacts for milk, cream, butter, eggs and creme fraiche. This was tweeted a number of times when suddenly I had a DM (direct message) from Sainsburys who follow me on Twitter. I couldn't believe it - they offered to source all our dairy supplies, no questions asked, no demands made. Phew!

A HUGE thank you to Rhona @sainsburys and Tim Whirledge for helping us source dairy for Sunday's #BlaggersBanquet !
3:19 PM Nov 12th from TweetDeck

Most of the Sainsbury's dairy arrived in increments and much to the bemusement of my flatmate - clearly if I ever design my own kitchen I should have a walk-in fridge for storing dairy if this banquet becomes an annual event

In the midst of sourcing ingredients, there were also generous offers from food producers and the likes of the lovely Petra Barran of Choc Star Van fame to donate their time for "experiential" prizes in the online auction:

RT @ChocStarVan: Experiential #blaggersbanquet prize - special Choc Star dessert delivered in choc-mobile to dinner of 12 anywhere in LDN
6:20 PM Nov 12th from TweetDeck

That auction started yesterday and Petra very kindly offered to drive to a lucky winner's house anywhere within metropolitan London to supply them with a dinner party chocolate bonanza from her van of chocolatey goodness. You can bid on this brilliant prize here and all proceeds go directly to Action Against Hunger!

Thursday's other huge problem was sourcing fridge space:

Urgent request for fridge space to store 20kg meat for this Sunday's #BlaggersBanquet !! Please RT
6:59 PM Nov 12th from TweetDeck

Which proved frustratingly elusive. We'll know that for next time

Then India Knight of The Times very sweetly plugged the banquet on our behalf:

RT @eatlikeagirl: thanks @indiaknight for your posterous link & RT - http://bit.ly/2fD5mQ - we'll be doing it again! :) #blaggersbanquet
8:13 PM Nov 12th from TweetDeck

The boyf did a requisite tweet at my behest about getting his ticket for the banquet (in exchange for some cinnamon loaf I believe) :

RT @twilliams81: have my ticket for Blaggers Banquet. SOOOOO excited to go #blaggersbanquet (envious of you feasting while I'm cooking!)
8:21 PM Nov 12th from TweetDeck

Sadly he was ill over the weekend so didn't make it which was a real disappointment :-( Next time Thomas!

The question of sourcing bread was proving to be as problematic as sourcing fridge space and out of the blue I decided to contact St John Restaurant to see if they could spare us some. I DM'd the restaurant and within minutes had a reply. One phone call later and we were promised 12 loaves of St John's finest sourdoughs, and they went down a treat on Sunday night with the main course stews! I tweeted:

Excellent news, we have delicious sourdough bread from @SJRestaurant donated for Sunday's #BlaggersBanquet - thank you so much :)
8:23 PM Nov 12th from TweetDeck

That night I dashed over to Marylebone and took a break from the banquet planning to be inducted into The Marmarati, a society of Marmite lovers, which turned out to be the most brilliant PR event I'd been to in a long time. Was good to get away from thinking about the banquet for a few hours, and I raffled off a copy of the brand-spanking new Big Bumper Book of Marmite (Absolute Press) at Sunday's banquet to boot!

Friday the 13th proved to be the most testing day of the week. There was still a fair amount of menu planning going on at this stage as several supplies we'd been promised didn't pan out, or the offers were rescinded. The main course supplies were resolved, and we had guarantees from both Riverford Organic and Abel & Cole to supply us with all the veg and fruit we needed for Sunday. I took delivery of Billington's sugar for the banquet dessert and coffee

Trethowan's Dairy had generously donated a 2kg Gorwydd Caerphilly for the banquet cheese board, but the question arose of who could collect it from Borough Market:

RT @goodshoeday: We need someone to go collect a 2kg cheese from borough today or tomorrow for #blaggersbanquet volunteers pls
11:57 AM Nov 13th from TweetDeck

Simon Majumdar, one half of fraternal blogging duo Dos Hermanos, offered to collect it for us even though he couldn't attend the banquet, and delivered the mighty cheese to Hawksmoor on our behalf, for which we're incredibly grateful

The Follow Fridays flooded in that day, and Niamh tweeted a shout out to the banquet team:

RT @eatlikeagirl: OK! My first #FF @scandilicious , @MathildeCuisine, @KaveyF & @sosusie - the stellar organisers for the #blaggersbanquet
12:32 PM Nov 13th from TweetDeck

Meanwhile I was still fielding comments about my cheese obsession from these two jokers:

@josordoni @goodshoeday are you two casting aspersions AGAIN?! Cheese induced coma? Jar of marmite and sourdough in my tiny hands?! Sheesh..
1:11 PM Nov 13th from TweetDeck in reply to josordoni

Then disaster struck:

About to get in shower betw/deliveries of #blaggersbanquet ingredients when door rang, ran downstairs+locked myself out. In dressing gown.
5:26 PM Nov 13th from web

Yes, I was in my dressing gown in the middle of the afternoon (all times on the tweets incidentally are Tehran time!) as I had been taking deliveries all morning and hadn't had a chance to shower or get dressed in between the emails, phone calls and tweets about the banquet. Needless to say this was hilarious for everyone on my street, not so hilarious for me

Anyway, I owe Jeni, my lovely flatmate and Kennards a big thank you:

Phew, back in flat! owe flatmate who ran home from work and the lovely people at Kennards, local deli next door a massive favour *thank you*
5:47 PM Nov 13th from TweetDeck

More deliveries ensued throughout Friday:

Barber's delicious 1833 Vintage reserve Cheddar occupying top part of flatmates' fridge - very tempting to break into before Sunday ;)
6:15 PM Nov 13th from TweetDeck

just taken delivery of @PetersYard 's fabulous crispbread we're using for canapes and cheese course at #blaggersbanquet *thanks Wendy + Ian*
6:38 PM Nov 13th from TweetDeck

Billington's sugar arrived for #blaggersbanquet now need to meet w/cooks @HawksmoorLondon and collect @TrishDeseine 's 72% chocolate !
8:14 PM Nov 13th from TweetDeck

And then it was time to dash down to Hawksmoor to meet with the cook's team (who I'll be writing about in my next blog post), Niamh and Tom of Hawksmoor to go through the plan for Sunday. We came away feeling confident but there was still much to be done and last-minute supplies to source...

Saturday 14th November: woke up with butterflies in my stomach, a long list of things to do, more deliveries to accept and a ferocious storm outside:

last day before #blaggersbanquet - collecting 5kg of @TrishDeseine chocolate, baking 10 chocolate fondant cakes for tomorrow! Bring it on
12:36 PM Nov 14th from TweetDeck

My cup runneth over....have just taken delivery of many many kilos of organic butter for #blaggersbanquet Nom nom! Thanks to @sainsburys
12:59 PM Nov 14th from TweetDeck

Off to Mayfair I went to collect chocolatier Trish Deseine's 72% chocolate for the banquet fondants, a recipe she had also kindly supplied. Let me tell you Mayfair is a curious place on a Saturday morning, virtually empty bar the occasional Bentley or Rolls. Was relieved to scamper back to safer realms of Bloomsbury where the Man was waiting for me, ready to bake:

RT @twilliams81: @scandilcious' flat helping with #blaggersbaquet. She is very bossy. Good job it fr charity ;0) (hah! bossy, me?)
2:53 PM Nov 14th from TweetDeck

Bossy! Well I can't complain, as I don't know many men who would sacrifice their Saturday afternoons to break big catering blocks of chocolate for making ten large fondants. In the process I tweeted:

loving @twilliams81 evident gusto in bashing up enormous block of @TrishDeseine 's chocolate for the fondants!
10:58 PM Nov 14th from TweetDeck

Yes, my Man is secretly a baker, he just doesn't know it yet

Not much sleep was had on Saturday night as there was still work to be done for Sunday, but you'll have to read about that tomorrow..

Hopefully this post has given a sense of what the preparation was like, and how down to the wire we were in many respects. It was a fascinating experience and we all learnt so much in a short period of time. I'll be posting thoughts on the banquet itself tomorrow, but in the meantime for a review have a peek at this one from food and drink writer Fiona Beckett here and see some fantastic photos capturing the event here
          Oil Supply Update        

So, that was a long blogging-break, no?

Sorry about that.  I spent the last semester teaching a new course at Cornell which sucked up all my energy.  I won't be teaching next semester, so things should be easier, but I will likely be supervising some Master's projects, as well as a restoration of my barn and an addition to my house (timber-frame, strawbale as previously studied and here implemented by Tugleywood Timberframing).  So I'm going to try to ease back into blogging on a weekly schedule for now (targeting Mondays).

The first few posts will probably be catching up.  I started by updating my global oil supply spreadsheets.  Nothing very dramatic happened in the last three months: supply continued to inch up, and prices are a little lower than during most of the last couple of years, but $100 remains an effective floor for Brent:


(Not zero-scaled). The next picture gives a close-up of the overall supply since the beginning of the Great Recession. Supply was flattish in 2012 and early 2013, but then managed a lift of around a million barrels/day in the second half of 2013:


This last picture also shows (green line) the narrower definition of oil given by "Crude and Condensate", which has been flatter than the "all liquids" represented by the black line:


This reflects the fact that most of the growth in "oil" supply in the last decade was not actually oil but rather biofuels and natural gas liquids (which are substitutes for oil in some applications to varying degrees).

Overall, things in the oil markets are fairly stable and not threatening in the near term; increased US production from tight oil has offset declines elsewhere in the world.  However, I continue to think the situation is comparatively fragile in that there is very little spare capacity in OPEC, and so a major geopolitical disruption could easily cause a big price spike.  My favorite gauge of this is how far below the maximum-ever production Saudi Arabia is.  That looks as follows:


After a spike up to around 10mbd last summer, Saudi production is down by about 0.5mbd.  So that's probably most of the world's proven short-term spare capacity there.  Not very much in a 91mbd global supply.
          Comment on Renovation Update #6 – Do the work by Kaz        
Hi Paul, sorry but I don't have that spreadsheet available for download currently. Cheers,Kaz
          Comment on Renovation Update #6 – Do the work by Paul        
Hi There, We are about to start renovating our house, and I noticed that you used an excel spreadsheet for the budget. Is it at all possible to be able to download the spreadsheet somewhere? Regards
          Rounding time to nearest minute or quarter hour etc. [formulas]        
The other day, I was building a spreadsheet to calculate FTE (full-time equivalent) for staff based on hours worked on various days in a fortnight. While building the spreadsheet, I came across an interesting problem: rounding time to the nearest minute. We can't use ROUND() or MROUND() to round time as these formulas aren't designed to work with time values. Although time values are technically decimal, rounding time to the nearest minute (or quarter hour etc.) can be tricky with the usual rounding formulas. Let me share a few formulas to round time to the nearest point. Let's say you have a time value (either user input or calculated) in cell A1. Use the formulas below to round the time in A1.
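As a rough illustration of the arithmetic behind such formulas (this is my sketch, not the article's own solution, which sits beyond the excerpt): a spreadsheet time value is a fraction of a 24-hour day, so rounding to the nearest minute or quarter hour amounts to scaling by the number of intervals in a day, rounding to a whole number, and scaling back.

```python
def round_time(day_fraction: float, minutes: int = 1) -> float:
    """Round a spreadsheet-style time value (a fraction of a 24-hour day)
    to the nearest multiple of `minutes`."""
    intervals_per_day = 24 * 60 / minutes   # 1440 for minutes, 96 for quarter hours
    return round(day_fraction * intervals_per_day) / intervals_per_day

def to_hms(day_fraction: float) -> str:
    """Format a day fraction as hh:mm:ss for easier checking."""
    total_seconds = round(day_fraction * 86400)
    h, rem = divmod(total_seconds, 3600)
    m, s = divmod(rem, 60)
    return f"{h:02d}:{m:02d}:{s:02d}"

t = (9 * 3600 + 7 * 60 + 41) / 86400        # 09:07:41 as a fraction of a day
print(to_hms(round_time(t)))                # 09:08:00 (nearest minute)
print(to_hms(round_time(t, minutes=15)))    # 09:15:00 (nearest quarter hour)
```

Under the same fraction-of-a-day convention, the spreadsheet equivalent of rounding A1 to the nearest minute is =ROUND(A1*1440,0)/1440.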
          How to Do a Proper Post-Race Data Analysis        

I’m a data detective. I spend my days investigating clues in the TrainingPeaks dashboard, WKO charts, my “homemade” excel spreadsheets and even post-session comments from my athletes. Why spend so much time examining this information? It’s where the clues to achieving your best performance lie! Every time that my athletes (or I) race, I collect […]

The post How to Do a Proper Post-Race Data Analysis appeared first on TrainingPeaks.


          Reply #6065        
Nothing to report down here.

I am a bit more hopeful about the final SM prize being found, or at least some of those other big winners. I called Piggly Wiggly, and they said they have 10 tickets of 2MJ left. They couldn't tell me the book number over the phone.

Maybe we should make a google spreadsheet on SM, given it is at 200 books or less across the state at this point. I haven't had a chance to compare the book inventory reports, but I suspect there's quite a few retailers out t... [ More ]
          Miniature Golf Course Business Plan        
The Business Plan for Your Miniature Golf Course Create the documents and spreadsheets you need to manage your miniature golf course business.
          Franchise Operation Business Plan        
The Business Plan for Your Franchise Operation Create the documents and spreadsheets you need to manage your franchise business.
          The Business Plan for Your Small Business        
The Business Plan for Your Small Business Create the documents and spreadsheets you need to manage your small business.
          Air Conditioning and Heating Company Business Plan        
The Business Plan for Your Air Conditioning and Heating Company Create the documents and spreadsheets you need to manage your air conditioning and heating company.
          Arcade Operation Business Plan        
The Business Plan for Your Arcade Operation Create the documents and spreadsheets you need to manage your arcade business.
          Bowling Alley Business Plan        
The Business Plan for Your Bowling Center Create the documents and spreadsheets you need to manage your bowling alley.
          Beauty Salon Business Plan        
The Business Plan for Your Beauty Salon Create the documents and spreadsheets you need to manage your beauty shop.
          Reply #6058        
The latest for 2MJ

https://docs.google.com/spreadsheets/d/1_NwLOh1LJwBmJOuKVxINyZYEDY1nl1fUJ6JUmRG1sto/edit#gid=0

Current List of Active Books:

Piggy Wiggly 5355 Cotton Street, Graceville (1 Act) - This book is becoming the thing of legends. Somebody up in the Panhandle needs to check this out. For all we know it might be out of the jackpot range. But it's the only large grocery store with an active book left that they might actually have it. Give them a call. Might get someone that can... [ More ]
          Coin Operated Laundry Business Plan        
The Business Plan for Your Coin Operated Laundry Create the documents and spreadsheets you need to manage your self service laundry business.
          Dry Cleaning Service Business Plan        
Business Plan for Your Dry Cleaning Service Create the documents and spreadsheets you need to manage your dry cleaning business.
          Barbecue Grill Service Business Plan        
Business Plan for Your Barbecue Grill Service Create the documents and spreadsheets you need to manage your barbecue grill service business. Sell and install barbecue grills, barbecue islands, and fire pits.
          Concierge Service Business Plan        
The Business Plan for Your Concierge Service Create the documents and spreadsheets you need to manage your concierge service business.
          Self Storage Operation Business Plan        
The Business Plan for Your Self Storage Operation Create the documents and spreadsheets you need to manage your self storage operation.
          Golf Driving Range Business Plan        
The Business Plan for Your Golf Driving Range Create the documents and spreadsheets you need to manage your golf driving range business.
          Movie Theater Business Plan        
The Business Plan for Your Movie Theater Create the documents and spreadsheets you need to manage your movie theater operation.
          Hair Salon Business Plan        
The Business Plan for Your Hair Salon Create the documents and spreadsheets you need to manage your hair salon operation.
          Reply #6039        
The Report for 2MJ is updated and all I can say is WTF???????

https://docs.google.com/spreadsheets/d/1_NwLOh1LJwBmJOuKVxINyZYEDY1nl1fUJ6JUmRG1sto/edit?usp=sharing

There were no new locations to magically appear. But what really, really bothers me is that I was not expecting the VERY high number of retailers that had idle books of 2MJ since before Thanksgiving, well these have magically disappeared. And they didn't go anywhere else. Now I would LOVE to believe that LP forum folks and lurker... [ More ]
          Reply #6035        
Just got home from work. On the way home I stopped at Lucky's and got the last 4 tickets of roll 2MJ 245,237. 3 were losers, number 59 was a $10 winner. So that takes another roll out.

After dinner I will take down the spreadsheet and re-update it with the newest info.

Hope some of you have better luck than me. I'm 1 for 10 on my last 10 2MJ (a $10 win
          Reply #6033        
Hello to eveyone in this forum buying 2MJ and SM - using the spreadsheets Dracos just posted and comparing the remaining prizes $1000 and up on both dates here are the results:

1 - 4 - 17 report 1 - 30 - 17 report

Game 1228 Super Millions

1 - $3,000,000 remaining 1 - $3,000,000 remaining - no change

3 - $20,000 remaining 3 - $20,000 remaining - no change

8 - $10,000 remaining 8 - $10,000 remaining - no change

11 - $5,000 remaining 11- $5,000 remaining - no change

206 - $1,00... [ More ]
          Reply #6015        
Cool, I updated the spreadsheet. If I mark the locations in green, it shows as being Sold Out. Did these happen to be in the range . since these books didn't appear until January on the retailer list?

For the Publix Book, was this in regards to SM? I didn't see a Publix for North Port on the 2MJ list a/o the 19th

There's a couple 2MJ books a little further north in Bradenton/St Pete that I think other LP's might be researching.
          Reply #6003        
https://docs.google.com/spreadsheets/d/1_NwLOh1LJwBmJOuKVxINyZYEDY1nl1fUJ6JUmRG1sto/edit?usp=drive_web

Here ya go.

Are you going to venture into the infamous Jolly World, lol. Be safe.

A couple places show having received rolls, but the only active locations I see are

Kangaroo 1150 Ocean Shores Blvd, Ormond Beach (1) Active

Jolly World, 1027 Mason Ave, Daytona (1) Active

Kangaroo 1520 SR 40 Ormond Beach, Kangaroo 201 N Main St Daytona Beach, and Winn Dixie 1541 N... [ More ]
          Reply #5999        
Thanks for the input. I updated the spreadsheet.

There's a few other places on the list nearby you might want to try before going after the 187K book.
          Reply #5995        
@ zzplayfaster

Really nice work on the spreadsheet and hopefully this will help out those still chasing the remaining prizes in this game. Below is a list of remaining prizes from the Wednesday 1-4-17 IRL report. I think its helpful as a game nears its end to compare weekly or monthly IRL reports to see if the list below shrinks and decent prizes are claimed. This provides hope to everyone buying up these remaining tickets/books that good prizes ($1,000 and up) still exist out there. So as o... [ More ]
          Reply #5994        
I cleared through the Amoco Shop roll of 2MJ in Fort Lauderdale today, well at least the last 20 tickets or so. Pretty bad return, only $20 back.

Also, thank you everyone for the positive comments!

Gratz to Jackpotchasing for the nice wins on the $20 Flamingo!

@Zzplay - That's an excellent Google doc, thanks for your analysis and organizing a nice spreadsheet for everyone.

@ Everyone else experiencing meh returns, fingers crossed. Freespirit, I'm hoping you hit som... [ More ]
          Reply #5993        
Here's the 2MJ Updated Spreadsheet (hopefully doesn't get marked as spam) with the latest info I have.

https://docs.google.com/spreadsheets/d/1_NwLOh1LJwBmJOuKVxINyZYEDY1nl1fUJ6JUmRG1sto/edit#gid=0
          Reply #5868        
Continuing on the last post (and until I get confirmation Google Sheets link won't be marked as spam) here is some relevant 2MJ info. Be interested to hear from Zeb if he thinks the Jackpot was released prior to Draco's report from 1 Dec. If not, then this spreadsheet possibly identifies all the last new books released from the Warehouse.

Here's what I got:

Currently 106 Locations show as having 2MJ as of the Report on 19 January. I did a search using the 19th's report to identify what ven... [ More ]
          Reply #5867        
Are we allowed to share Google Docs links or is that spam?

I have created a very interesting spreadsheet for those of you chasing 2MJ. What I did was compare the reports Dracos provided from 1 Dec, 6 Jan, and 19 Jan and looked for changes in retailers and books sold. As expected there's a lot of locations with received books from 1 Dec that are still showing the same thing with no change leaving it up in the air if they really have these books or not.

HOWEVER, I found some quite useful tr... [ More ]
          JPMorgan Flaws Should Ring Alarm Bells Everywhere        

JPMorgan was supposed to be among the best managers of bank risk in the world. This week it published an internal report into the failings which led to $6.2 billion of trading losses at its chief investment office in 2012. If the mix revealed – conflicting mandates, discredited theory, inadequate checks and primitive technology – is really as good as it gets, financial watchdogs and investors everywhere should worry. There are plenty of lessons for regulators and bank executives who want things done right.

First, the controls should match the mission of a unit that manages excess cash, as the CIO did, and is trying to make money in the process. The report suggests JPMorgan’s supervision was set for the days when the CIO was a sleepier and much smaller operation which engaged in simple, old-fashioned hedging. Not enough changed when the CIO morphed into a trading operation that was a force in the market for complex synthetic credit default swaps. One trader was nicknamed the London Whale in press reports.

In particular, the CIO was under pressure to minimize reported risk. Models are used to calculate the measures of risk: risk-weighted assets (RWA), used to calculate capital strength, and value-at-risk (VaR), used to estimate likely losses. When the models suggested that some assets should be sold to keep the risk at an acceptable level, the unit’s traders sometimes just tried to change the models. But the bank’s senior risk staff did not summon up the necessary skepticism.

Second, the CIO’s VaR models relied on flawed theory. Credit default swaps simply don’t behave in line with the normal, or Gaussian, distribution typically assumed. The so-called tail risks, or the chances of extreme events, are bigger than that theory predicts. Ina Drew, who ran the CIO, referred to one day’s mark-to-market losses as an eight standard deviation event, according to the report. That translates mathematically into something that should only happen once every several trillion years. It makes no more sense than Goldman Sachs finance chief David Viniar’s famous remark as the crisis unfolded in 2007 about seeing 25 standard deviation events, several days in a row.
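As a back-of-the-envelope check of that "once every several trillion years" figure (my illustration, not Breakingviews'): assuming daily losses really were normally distributed, the one-sided probability of an eight-standard-deviation day, and the implied waiting time, work out as follows.

```python
import math

# One-sided tail probability of an 8-standard-deviation move under a
# normal (Gaussian) distribution: P(Z > 8) = erfc(8 / sqrt(2)) / 2.
p = 0.5 * math.erfc(8 / math.sqrt(2))

# Expected waiting time for such a day, using ~252 trading days per year.
days = 1 / p
years = days / 252

print(f"P(Z > 8)      ~ {p:.2e}")            # roughly 6e-16
print(f"expected wait ~ {years:.1e} years")   # a few trillion years
```

Which is why repeated "multi-sigma" loss days read less like cosmic bad luck and more like evidence against the Gaussian assumption itself.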

Third, not only was the VaR approach flawed, but JPMorgan did not calculate VaR correctly. At least one model was stored on Excel spreadsheets with formulae that hadn’t been properly checked. Inputs were supposed to be updated manually, but some numbers were out of date.

Someone in another part of JPMorgan even spotted an error in the CIO’s VaR calculation. For the person in question to take the time to identify the mistake, the VaR number must have looked badly off-kilter. Yet CIO managers dismissed it as a one-off. At best, the CIO’s systems last year look alarmingly amateurish. At worst, the managers didn’t understand or care much about the risks. Either way, these problems provide definitive talking points for regulators examining other companies.

Next comes the pricing of illiquid positions. It’s a fact of a financial firm’s life that when there is room for judgment or different answers, it is in a trader’s interest to provide the rosiest possible numbers. That clearly happened in the CIO. It did not follow best industry practice of asking outsiders to provide prices for illiquid positions. Managers should not rely on traders themselves, as the CIO largely did – but JPMorgan is surely not the only institution to do so.

Illiquid assets are also hard to shift, and for the CIO’s massive trades even accurate models would not have been reliable indicators of actual market prices. Other traders and financial media were already picking up on the CIO’s huge positions, but Jamie Dimon, the bank’s chief executive, agreed that these rumblings were just “a tempest in a teapot.” It was a big teapot: certain derivative positions may have had a nominal value of as much as $10 trillion, although there were other offsetting trades. Small mistakes about market prices could, and did, lead to large losses.

Dimon and his subordinates understandably wanted to believe that everything was indeed under control; unfortunately they believed it without first asking enough hard questions.

Lastly, JPMorgan’s report concludes that its compensation system wasn’t a problem. But it looks as though the incentive system didn’t help. Some traders were reluctant to unwind positions fully because it would crystallize losses, which would in turn normally hurt bonuses. Drew did nothing to reassure them. In any case, for bank traders, closing positions feels like housekeeping, which is never as well paid as making profitable trades.

Bank bosses and regulators alike need to ensure that measuring, managing and minimizing risk is part of the culture of any trading room. JPMorgan’s whale debacle shows how not to do it. Dimon has taken steps designed to change things. With luck, the lessons will help other banks avoid similar troubles.  

Read more at Reuters Breakingviews.


          How the 47% Beat the Super PACs        

This U.S. election provided a valuable math lesson for those worried about the consequences of income inequality: the 47 percent of the population dismissed by Mitt Romney during his campaign can wield greater power than the richest 1 percent. 

Ultra-wealthy Republican candidates included professional wrestling maven Linda McMahon, who ran for a Senate seat in Connecticut. Rich GOP supporters like the billionaire Koch brothers also worked spreadsheets hoping to somehow reverse the algebraic reality. They learned the hard way that money can’t buy everything in America.

The Occupy movement’s 1 percent label distinguished the haves from the have-nots. And Romney, for many the personification of that financially elite group, inadvertently provided the 47 percent reference. Roughly speaking, he quantified that mostly Democrat proportion of the U.S. population as irredeemably reliant on the government.

True to form, the rich put their lucre to work politically. About $1 billion was deployed through Oct. 17 in the effort to elect Romney, 10 percent more than for Barack Obama, according to the Center for Responsive Politics, a political contribution research group. Romney received twice as much backing from Super PACs, the new organizations that can raise limitless funds from individuals and corporations. Wall Street gave three times more to the former Bain Capital boss than to Obama.

Candidates with more money won many races and Democrats had wealthy backers, too. But the basic equation held. Romney raised nearly $200 million from contributions of at least $2,000 apiece - almost twice Obama’s haul for checks that size, according to the Federal Election Commission. The president, by contrast, pulled in over $400 million from donations of $200 or less, quadruple Romney’s receipts at that level. The polls reflected the disparity. Nearly six of 10 voters earning under $50,000 a year chose Obama.

The estimated $6 billion spent on all races combined attests to the unseemly influence of money in U.S. politics. And cash could matter even more as companies and well-heeled donors find more effective ways to use Super PACs. That demands vigilance from all citizens. Tuesday’s results, however, show how simple the math can be and that votes are often a great equalizer. 


          Comment on How THIS Blogger Stays Organized by gretchen        
My grocery stores don't have that awesome feature, but I made a grocery list spreadsheet of what we buy, sorted by aisle. I asked the store for a copy of their stocking map to create it. Saves time, especially when the kids come. No impulse buying either.
          Comment on How is the VSPP / vCAN Usage Calculated? by Jonas        
Hi! May I also have a copy of your spreadsheet? Thanks!
          iWork: The changes in Apple's productivity suite        

This week brought a big update to iWork, the iOS, OS X and iCloud productivity suite made up of Pages, Keynote, and Numbers. You may have already grabbed the updates for iOS and OS X from the respective App Stores and not noticed too much of a visual change to the apps, but here's what has changed.

iCloud Versions

Let's take a quick look at the iCloud version of the apps. All of the apps now feature Retina display-ready graphics that really look impressive on a MacBook Pro with Retina display. While I must confess to not having used the previous versions of the beta iCloud apps very much, it does appear that Apple has attempted to make the app look very similar to the iOS apps.

Documents can now be shared with others in a view-only mode, making it easy to let someone see the latest revision of a document without giving them full access to make changes. For new documents, additional templates have been added to the mix. If anyone sends you a Pages, Keynote or Numbers document via iCloud Mail, you can now open that document directly in the iCloud version of the app -- the email features an "Open in Pages/Keynote/Numbers" link, making it easy to get right to work.

Pages

The Mac version of Pages now allows users to delete, duplicate, and reorder sections of their documents using the page navigator, and copying and pasting styles has been improved a bit. Apple says that they've improved Instant Alpha editing of images, although I saw no variation in the way that function works. The Media Browser is improved, although still not exactly speedy.

I did see vastly improved support for AppleScript in Pages 5.2. That's something that power users have been asking for since Pages 5.0, and the addition of an iWork Suite of commands appears to bring back most of the functionality that was available in previous versions of Pages. That suite is available for all of the iWork apps.

Apple says that they've improved text box behavior, although I was unable to ascertain exactly what was different from previous versions. There's improved support for EndNote, including citations in footnotes, and for those who are using Pages for ebook creation, ePub export is allegedly better.

The iOS version now lets you search documents by name -- previously, you could only browse documents in a list or thumbnail mode. Inline images and shapes in table cells are now preserved properly when you import a document or table, and placement of inserted and pasted objects now seems to work better. If you write in Hebrew, you'll be glad to know that there's now a word count feature for that language, and all in all the app seems somewhat more usable (especially on iPad) than previously.

Keynote

Apple's presentation app gained some new features on iOS, including one that I am already in love with -- you can now use your finger to draw on any of your slides by just tapping and holding. A "crayon box" of pencils shows up at the bottom, along with the familiar "laser pointer". It's now possible to hold your iPad in portrait mode while giving a presentation thanks to a new portrait layout option in the presenter display. A couple of new transitions and builds -- object revolve, drift and scale, and skid -- have been added, and animations just seem to be much smoother than before.

The Mac version adds some fun features in addition to those found in the iOS version -- there are improved presenter display layouts and labels, and Magic Move now includes text morphing. The app now exports to PPTX format, and there's support for animated GIFs being pasted or imported into presentations.

Numbers

This is the part of Apple's productivity suite that I probably use the least, both on iOS and Mac. Some of the big changes to the iOS version include the ability to search spreadsheets by name and faster imports of CSV (comma-separated text) files, as well as improved compatibility with Microsoft Excel documents.

The Mac version adds the ability to set margins and create headers and footers in print setup, and there are new printing options that include page numbering, page ordering, and zoom. If you want custom data formats, you can now create them in Numbers. Customization of table styles is also added. And remember those CSV improvements in the iOS version? Now you can drag and drop a CSV file right onto a sheet, or update an existing table by dragging in a CSV file.


          Guest post: It's 2016 and your data aren't UTF-8 encoded?        

The following is a guest post by Bob Mesibov.

According to w3techs, seven out of every eight websites in the Alexa top 10 million are UTF-8 encoded. This is good news for us screenscrapers, because it means that when we scrape data into a UTF-8 encoded document, the chances are good that all the characters will be correctly encoded and displayed.

It's not quite good news for two reasons.

In the first place, one out of eight websites is encoded with some feeble default like ISO-8859-1, which supports even fewer characters than the closely related windows-1252. Those sites will lose some widely-used punctuation when read as UTF-8, unless the webpage has been carefully composed with the HTML equivalents of those characters. You're usually safe (but see below) with big online sources like Atlas of Living Australia (ALA), APNI, CoL, EoL, GBIF, IPNI, IRMNG, NCBI Taxonomy, The Plant List and WoRMS, because these declare a UTF-8 charset in a meta tag in webpage heads. (IPNI's home page is actually in ISO-8859-1, but its search results are served as UTF-8 encoded XML.)

But a second problem is that just because a webpage declares itself to be UTF-8, that doesn't mean every character on the page sings from the Unicode songbook. Very odd characters may have been pulled from a database and written onto the page as-is. In ALA I recently found an ancient rune — the High Octet Preset control character (HOP, hex 81):

http://biocache.ala.org.au/occurrences/6191ca90-873b-44f8-848d-befc29ad7513
http://biocache.ala.org.au/occurrences/5077df1f-b70a-465b-b22b-c8587a9fb626

HOP replaces ü on these pages and is invisible in your browser, but a screenscrape will capture the HOP and put SchHOPrhoff in your UTF-8 document.

Another example of ALA's fidelity to its sources is its coding of the degree symbol, which is a single-byte character (hex b0) in windows-1252, e.g. in Excel spreadsheets, but a two-byte sequence (hex c2 b0) in UTF-8. In this record, for example:

http://biocache.ala.org.au/occurrences/5e3a2e05-1e80-4e1c-9394-ed6b37441b20

the lat/lon was supplied (says ALA) as 37Â°56'9.10"S 145Â° 0'43.74"E. Or was it? The lat/lon could have started out as 37°56'9.10"S 145°0'43.74"E in UTF-8. Somewhere along the line the lat/lon was converted to windows-1252 and the Â° characters were generated, resulting in geospatial gibberish.
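That kind of round trip is easy to reproduce. A minimal Python sketch (purely illustrative, not ALA's actual pipeline):

```python
# The degree sign is one byte in windows-1252 but two bytes in UTF-8.
degree = "°"
print(degree.encode("windows-1252").hex())   # b0
print(degree.encode("utf-8").hex())          # c2b0

# Classic mojibake: UTF-8 bytes read back as if they were windows-1252.
print(degree.encode("utf-8").decode("windows-1252"))   # Â°

# And the single windows-1252 byte b0 is not valid UTF-8 at all.
try:
    degree.encode("windows-1252").decode("utf-8")
except UnicodeDecodeError as err:
    print("not valid UTF-8:", err)
```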

When a program fails to understand a character's encoding, it usually replaces the mystery character with a ?. A question mark is a perfectly valid character in commonly used encodings, which means the interpretation failure gets propagated through all future re-uses of the text, both on the Web and in data dumps. For example,

http://biocache.ala.org.au/occurrences/dfbbc42d-a422-47a2-9c1d-3d8e137687e4

gives N?crophores for Nécrophores. The history of that particular character failure has been lost downstream, as is the case for myriads of other question marks in online biodiversity data.

In my experience, the situation is much worse in data dumps from online sources. It's a challenge to find a dump without question marks acting as replacement characters. Many of these question marks appear in author and place names. Taxonomists with eastern European names seem to fare particularly badly, sometimes with more than one character variant appearing in the same record, as in the Australian Faunal Directory (AFD) offering of Wêgrzynowicz, W?grzynowicz and Węgrzynowicz for the same coleopterist. Question marks also frequently replace punctuation, such as n-dashes, smart quotes and apostrophes (e.g. O?Brien (CoL) and L?Échange and d?Urville (AFD)).

Character encoding issues create major headaches for data users. It would be a great service to biodiversity informatics if data managers compiled their data in UTF-8 encoding or took the time to convert to UTF-8 and fix any resulting errors before publishing to the Web or uploading to aggregators.
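As a rough sketch of what that clean-up step could look like, here is one way to re-encode a dump and repair recoverable mojibake with the ftfy library (the file names and the assumed source encoding are illustrative; question marks that have already replaced characters cannot be recovered this way):

```python
# Re-encode a data dump as UTF-8 and repair recoverable mojibake with ftfy.
# Characters already replaced by '?' are gone for good and will stay as '?'.
import ftfy

with open("dump.csv", encoding="windows-1252") as src:   # assumed source encoding
    text = src.read()

fixed = ftfy.fix_text(text)   # e.g. turns 'NÃ©crophores' back into 'Nécrophores'

with open("dump_utf8.csv", "w", encoding="utf-8") as dst:
    dst.write(fixed)
```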

This may be a big ask, given that at least one data manager I've talked to had no idea how characters were encoded in the institution's database. But as ALA's Miles Nicholls wrote back in 2009, "Note that data should always be shared using UTF-8 as the character encoding". Biodiversity informatics is a global discipline and UTF-8 is the global standard for encoding.

Readers needing some background on character encoding will find this and especially this helpful, and a very useful tool to check for encoding problems in small blocks of text is here.


          Comment on Throwing Water on Our FIRE! by Eddy G        
Sounds like you are a worrier - like me! Also in Texas and on the fence. My spreadsheet says I am ready to FIRE but am not certain if I'll have major regrets in a few years - especially if the market turns. Good to hear there are others in the same boat.
          Branch Office Administrator        
FL-Lakeland, Job Objective: Jasper Contractors is seeking a full-time dynamic, friendly and organized entry level Administrative Assistant to perform office support activities for multiple supervisors. As an Admin Assistant your role is to apply for permits, track data on spreadsheets, and communicate to the team by phone & email. Jasper Contractors is a fast growing company with many opportunities for advance
          IT audit trail, real time audit, IT forensics        
IT Audit Trail
An Audit Trail is a feature in a program that records every activity performed by each user in a log table. An IT Audit Trail, more specifically, is the part of a program that records in detail the audit-relevant activities carried out by its users.
By default, an Audit Trail records the time, the user, the data accessed, and the type of activity. Activities can include adding, changing, and deleting data. Sorted by time, the Audit Trail forms a chronology of how the data was manipulated. With an Audit Trail in place, every activity in the program concerned should be properly recorded.
How an Audit Trail works:
  1. An Audit Trail stored in a table:
  • By inserting a record-logging statement into every Insert, Update, and Delete query.
  • By using the DBMS trigger feature. A trigger is a set of SQL statements that automatically writes a log entry on INSERT, UPDATE, and DELETE events on a table (see the sketch below).
  • When the Audit Trail facility is enabled, every transaction entered into Accurate has its journal recorded in a table, including by whom and when. If a transaction is edited, the old journal entry is kept alongside the new one.
   2. Audit Trail output is stored in one of the following forms:
·   Binary file: compact, but not directly human-readable.
·   Text file: large, but readable as-is.
·   Database table.
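As an illustration of the trigger approach mentioned above, here is a minimal sketch using Python's built-in sqlite3 module (the table and column names are invented for the example; a production audit trail would also record the application user, which the database itself does not know):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE accounts (id INTEGER PRIMARY KEY, owner TEXT, balance REAL);
CREATE TABLE audit_trail (
    changed_at  TEXT DEFAULT CURRENT_TIMESTAMP,
    action      TEXT,
    account_id  INTEGER,
    old_balance REAL,
    new_balance REAL
);
-- The trigger writes a log row automatically on every UPDATE of accounts.
CREATE TRIGGER accounts_update_audit AFTER UPDATE ON accounts
BEGIN
    INSERT INTO audit_trail (action, account_id, old_balance, new_balance)
    VALUES ('UPDATE', OLD.id, OLD.balance, NEW.balance);
END;
""")

conn.execute("INSERT INTO accounts (owner, balance) VALUES ('alice', 100.0)")
conn.execute("UPDATE accounts SET balance = 250.0 WHERE owner = 'alice'")

# One audit row per change, with a timestamp and the old and new values.
for row in conn.execute("SELECT * FROM audit_trail"):
    print(row)
```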
Real Time Audit
Real Time Audit (RTA) is a system for monitoring technical and financial activities so that it can give a transparent assessment of the current status of all activities, wherever they take place. RTA supports every step of a project, from the concept and the preparation of a full proposal, through decision analysis to identify the best projects, to supporting decisions on whether to accept or decline the investment.
In project development, RTA plays an analytical role, making sure that quality is actually being delivered. RTA is also useful for specialised procurement, allowing a manager to scrutinise competing bids to supply services or project components.
RTA provides an ideal technique for those responsible for funds, such as donors, investors, and activity sponsors, to gain visibility into the managers of the activities they fund and to monitor progress.
RTA is very effective for building procedures into the financing agreements covering the project or activity concerned. It provides the main components needed for effective and efficient management and oversight of activities.
RTA is fully transparent and gives project managers and donors/sponsors direct online access to whatever information they need, quickly. The benefit of RTA is improved productivity of information access and, as a result, of project-management tasks.
What is IT Forensics?
A simple definition: the use of a set of procedures to examine a computer system thoroughly, using software and tools to preserve evidence of criminal acts.
According to Noblett, its role is to retrieve, preserve, recover, and present data that has been processed electronically and stored on computer media.
According to Judd Robin, it is simply the application of computer investigation and analysis techniques to determine possible legal evidence.
In short, IT Forensics can be defined as the science concerned with collecting facts and evidence of information-system security violations and validating them according to the method used (for example, a cause-and-effect method).
Why use IT Forensics?
  • In legal cases, computer forensic techniques are often used to analyse computer systems belonging to the defendant (in criminal cases) or the litigant (in civil cases).
  • To recover data after a hardware or software failure or error.
  • To analyse a computer system after a break-in, for example to determine how the attacker gained access and what the attacker did.
  • To gather evidence against an employee that an organisation wishes to dismiss.
  • To obtain information about how a computer system works for the purposes of debugging, performance optimisation, or reverse engineering.
When did IT Forensics come into use?
In 2002 an estimated 544 million people were connected online. A growing population of internet-connected people creates opportunities for computer crime in all its variations. Several tendencies lie behind the emergence of computer crime, including:
a. Financial motives. Cybercrime is a new way to make money. Practices such as carding (taking over a credit card without the permission of the party that actually holds the authority), diverting telephone accounts and other facilities, or companies in certain sectors seeking to bring down competitors in the fight for market share are all forms of cybercrime with a financial motive.
b. Issues related to politics, the military, and nationalist sentiment.
One example is the hacker attacks in the early 1990s against America's most secret bomber, the Stealth Bomber. The high-end technology installed on that aircraft made it an attractive arena for competition between countries developing their combat equipment.
c. Perpetrator satisfaction, which is a matter of the perpetrator's psychology.
There is a tendency for someone highly skilled at security intrusion to feel continually challenged to break through ever stricter security systems. Inner satisfaction becomes the main motivation, rather than financial gain or sentiment.
  • An important element in resolving computer security and computer crime problems is the use of science and technology itself. Science and technology can be used by the authorities, such as investigators, the police, and prosecutors, to identify suspects in criminal acts.
  • Digital evidence is one of the vital instruments for uncovering cybercrime, provided adequate evidence of the crime is obtained. Digital evidence can take the form of e-mail, word-processor files, spreadsheets, software source code, images, web-browser data, bookmarks, cookies, and calendars.
There are four elements of forensics:
1. Identification of digital evidence
2. Preservation of digital evidence
3. Analysis of digital evidence
4. Presentation of digital evidence
Who uses IT Forensics? Auditors and computer-forensics specialists: roles that carry a great deal of responsibility, must be independent, and are formally assessed.

          Tom Wakely for Governor        
I have my candidate.  How about you?


A month ago, at the top of this (somewhat depressing, reality-driven) post about the current market value of voting, especially in our beloved Texas, I quoted Wakely's mention in the excellent (mostly progressive Democratic) blog Down With Tyranny and his successful (if you consider limiting Lamar Smith to just under 57%) effort to win a Central Texas seat in Congress last year.

Wakely is everything you'd expect in a seasoned white progressive populist.  He's a reincarnation of my old pal David Van Os, with less picante.  He's Bernie Sanders with a cowboy hat.  Like Bernie, he may eventually find a little traction among millennials, people of color, women, and others who want to see a different and better Texas, but without much in the way of a website at the moment (update: better website) or money flowing to his campaign he is likely going to be relegated to Green Party-like numbers.

That doesn't matter as far as I'm concerned.  Zack Lyke, who managed his campaign against Lamar Smith in TX-21, also ran John Courage's successful San Antonio city council effort.  So let's hope he does the same again for Wakely, at least until the candidate rises in the polls and the money starts rolling in and the thousand-dollar Italian-suited Democratic political consultants try to push him out.

Since Wakely has castigated the Democratic Party so harshly, I'm still thinking Democrats' only hope is to get behind a Draft Joe Straus effort.  But if they pick a Clintonite, I think it's going to be fun to watch how much farther than Wendy Davis that person falls.  Remember, no more straight ticket voting unless a court rules otherwise and the decision stands for next year (see Update IV immediately below).  My humble O is that it hurts Democrats electorally, but does not rise to the level of discrimination.  I'm not a judge, though.

Update IV (7/21, and time for a new post on this topic): Kuff's opinion about Wakely, to put it mildly, lacks enthusiasm for the candidate, knocking down Wakely's contention about being the highest vote-collector among Congressional Democratic challengers to Republican incumbents with some numbers from one of his trusty spreadsheets, and correcting me about when the voter ID law eliminating straight-ticket voting is to take effect.

Correction duly noted, but that first part seems a little "to-may-toe, to-mah-toe" to me.

Update: Oh looky here.  Jeffrey Payne, small business owner, is in a same-sex marriage so the Dallas Voice helped him out, but Stace doesn't think much of his language on the anti-sanctuary law, and the TDP and Kuff seem less than enthused.  Lacks a website, name recognition, money, donors, etc. like every other Democratic/Green/progressive independent candidate so there's that.

Update III (7/20): I now understand the Democrats' reticence to get behind (sorry) "International Mr. Leather" for Governor.

Update II: In response to my query as to party affiliation or lack thereof, Wakely tweeted the following back at me this morning.


Seems a little conflicted, and certainly his previous statements about Texas Democrats are going to be held against him, but as he says ... wait and see.

Wakely posted a fairly lengthy announcement at Down With Tyranny.  Here it is (bold emphasis is mine, with a few minor English-teacher-style corrections.  Hey, everybody needs a proofreader, including me).

My name is Tom Wakely and I am a candidate for Governor of Texas but before we get into that I’d like to tell you a story or two of how I came to the decision to run and why I am running.

My wife and I run a private care home for hospice patients in San Antonio. We offer them a place in our home to die. We have been doing this for a little over eight years now and we have helped 48 people to die with dignity and respect. My wife is from Mexico and she just became a US citizen this past spring. I was born in San Antonio and right after high school at the age of 17, I enlisted in the U.S. Air Force. After my discharge, I returned to San Antonio and soon found myself working with Cesar Chávez on the grape boycott campaign in Texas. I was very politically active during the ’70s. But all my work, all my idealism, came crashing down in the fall of 1980 with the election of Ronald Reagan as President. I was 27 years old. I was lost, bewildered, not sure what I should do or where I should do it. For the next few years, I was an aimless wanderer, traveling all over the country. I tried going back to college but it didn’t hold my interest. I became a stockbroker for a while but decided that wasn’t my thing. I flipped burgers for a few months, drove a cab, I even took a job on the Mississippi cleaning the inside of river barges. I eventually drifted back down to Texas with nothing more than a few bucks in my pocket. The political revolution that I had been so much a part of had failed and nothing I did or thought I wanted to do could fill the emptiness in my soul.

I was now 32 years old, alone and tired. I had just broken my right ankle in a stupid accident and as I lay in my hospital bed, thinking about my life, I had an epiphany. Of course, at the time, I didn’t understand it to be an epiphany, but it was. Anyway, I healed and after a few phone calls followed by a few interviews, I found myself back in the Midwest, enrolled at the Chicago Theological Seminary.

I mention this because it is relevant to why I am running for Governor of the State of Texas.

Now, you have to remember when I entered seminary in 1985, apartheid was still the political and social system in South Africa. I had been and was now again very active in the anti-apartheid movement, this time in Chicago. To my surprise my seminary had a relationship with a South African seminary and with Desmond Tutu. It worked like this. When it looked like an ANC fighter, a college professor or a shopkeeper was about to be arrested and imprisoned by the authorities, they were immediately whisked away, enrolled in the South African seminary, then within a few days, they were in Chicago, in class with me. I became friends with all of them and still maintain that friendship with a few of them today.

These men taught me two very important life lessons. The first was don’t give up hope. Don’t let losing a battle discourage you. Keep on resisting. The second lesson was when a political opportunity presents itself, grab on to it as you may not get another chance. Which brings me to the question that everyone is asking me. Why in the hell are you running for Governor of Texas?

To answer that question, you need to understand Texas politics. Texas is not a red state; it is a no-vote state. In the 2014 general election that saw the rise of Greg Abbott and his tea-party brethren to power -- Abbott becoming Governor and Dan Patrick becoming Lt. Governor -- only 38% of our state’s registered voters voted. Abbott took 60% of the 38% which means he only received the support of 22% of the state’s registered voters. The Democratic candidate, Wendy Davis, did far worse. She only received 40% of the 38% which means just a little over 15% of the state’s registered voters supported her. This voting pattern has been fairly consistent and repeated over and over again for decades.

Something is terribly wrong here. Over 60% of my state’s registered voters are consistently not voting.

The result is that Texas is now controlled by a small minority of politically, socially and religiously conservative people and the Texas Democratic Party has no clue what to do about it, or if they do know what the problem is, they simply have chosen to ignore it. We all know there is still a hell of a lot of money to be made losing elections.

Over the past 4 months I have talked to literally thousands of non-voters all across the state and asked them why they didn’t vote and they all have told me basically the same thing: they don’t vote because they know the Republican Party doesn’t care about them or their family and the Democratic Party has abandoned them. That is the reason why over 60% of our state is not voting: they know that neither party cares about the working men and women of this state. To paraphrase Bernie Sanders, Texas cannot survive morally or economically much longer when so few have so much and so many have so little.

I entered a 2016 Congressional race here in Texas on the heels of Bernie’s bid to secure the Democratic Party nomination for President. I ran as an economic populist on a strong bold progressive agenda against 30-year Republican incumbent Lamar Smith, the Chair of the House Committee on Science, Space and Technology. While we lost that race, we did manage to secure a few moral victories. We received more votes than any candidate who had ever run against Smith; we managed to drop Smith’s percentage of the vote total to its lowest level ever -- 56.9% -- and our campaign received more votes than any Democrat in the State of Texas running against an incumbent Republican member of Congress. And we did all that with no institutional support from the Texas Democratic Party and with very little money -- a tad over $70,000, which included a $15,000 loan I made to my campaign.

Like my old ANC friends from seminary said, when a political opportunity presents itself, grab on to it as you may not get another chance. Well, the political opportunity in Texas is now. Governor Greg Abbott’s attacks on labor, on women, on refugees and immigrants, on Hispanics and other minorities, on the LBGTQ community, on the poor in our state, on our environment and on our great cities, needs to be responded to with the most forceful weapon we have at our disposal - the ballot box.

I am entering this race for Governor not because I want to but because I have to. When I was in seminary I learned about Martin Niemöller, the Lutheran minister who was an outspoken public foe of Adolf Hitler and who spent the last seven years of Nazi rule in concentration camps. He summed up perfectly my feelings and why I am running for Texas Governor. Niemöller said: “First they went after the Communists, and I did not stand up, because I was not a Communist. Then they went after the homosexual and infirm, and I did not stand up, because I was neither. Then they went after the Jews, and I did not stand up, because I was not a Jew. Then they went after the Catholics, and I did not stand up, because I was Protestant. Finally, they went after me, and there was no one left to stand up for me.”

As far as I am concerned, the defining principles of the 2018 Texas Governor’s race are moral issues: respect for the dignity of everyone living in Texas; respect for the dignity of work and the rights of workers; the call to family and to community; the rights and responsibilities of all Texans; a preferential option for the poor in our state; valuing our fellow Texans and respecting who they are as individuals; and caring for God’s creation -- the air, water and land.

As I mentioned above, with only a little more than 22% of the state’s registered voters supporting Governor Abbott and his tea party brethren, I again have to ask myself why are over 60% of our state’s registered voters not voting. The answer I believe to why so many Texans are not voting is because no serious candidate for Governor has ever talked to them about income inequality. Well, I intend to talk to the 60% about income inequality in our state. Look, Texas has the world’s 12th-largest economy but we rank 8th among the states as far as income inequality goes. San Antonio, the 7th largest city in the country, my home, ranks # 1 in income inequality. If we are serious about reducing income inequality in Texas we need to make it easier for people to join a union, not harder, and that is why I support repealing our state’s right to work laws. I also support raising Texas’s minimum wage to $15 an hour. It’s a start.

I will also talk to the 60% about how a person can be a strong supporter of the 2nd Amendment and at the same time support common sense stuff like background checks at gun shows. Advocating for gun violence prevention programs is in no way, no how, inconsistent with being a 2nd Amendment supporter.

I will attempt to explain to the 60% why abolishing the death penalty in our state makes sense. Look, I understand if someone killed a friend or family member of mine, I would want vengeance as surely as the next man would. But I refuse to give that power to the state. Texas has already executed at least 2 innocent men over the past decade. A mistake that can’t be undone. If you look at the death penalty strictly from an economic perspective, the death penalty system is much more expensive than sentencing inmates to life imprisonment. Sentencing a person to life costs the Texas taxpayer about $700,000, while sentencing someone to death, including court appeals, can easily run over $2 million. Besides, nothing could be worse than spending your life in an 8’ x 6’ cell.

I also want to reach out to the 60% and ask them a simple question: does your child or grandchild have asthma? I will point to the fact that Bexar County, home to San Antonio, leads the state in the number of children hospitalized for asthma. I will tell them that Texas is the number one source of oil and gas methane pollution in the country. I will tell them that is why I want to ban fracking and flaring in our state. I will also tell them we can create tens of thousands of new jobs by moving our state from a fossil fuel economy to a renewable energy economy.

Today, like in many states, the number one issue is rising personal property taxes to fund our public school systems. So, I want to know why Governor Abbott is not supporting a proposal by a colleague of his to abolish school property taxes altogether and find new revenue streams. Well, I think it is a great idea and among the dedicated revenue streams I see available to us are scrapping my state’s complicated franchise tax system and replacing it with a business income tax. I also support the legalization and taxation of marijuana in Texas. The revenue from both sources would put a serious dent in the funds needed to make Texas public schools number one in the nation and at the same time lower the personal property taxes that so many of us are struggling to pay each year. (Side note here, last year, my wife and I had to pay our tax bill with a credit card.)

I want everyone to know I am running for Governor because I want to make Texas Great. After decades of abuse, the women and children of our state need someone to stand up for them. When I see so many Texans hurt and killed by senseless gun violence, they need someone to stand up for them. When I see racist legislation like the anti-sanctuary city bill SB 4 signed into law by Governor Abbott and his support of Lt. Governor Dan Patrick’s effort to regulate bathroom use by transgender people in public buildings, I cringe with fear. Fear of the words that Martin Niemöller spoke so long ago: “Finally, they went after me, and there was no one left to stand up for me.”

          FY2015 HUD CoC Program Scoring Criteria Summary and Score Estimating Worksheet        

This excel spreadsheet is a tool to spark community discussions about 2015 NOFA and CoC performance. It outlines each area CoCs will be scored on as a part of the consolidated application and allows them to estimate how many points they will receive in each category. The tool is also meant to encourage awareness for CoC leadership, CoC Boards, and all CoC-funded projects regarding the importance of having a performance-based review and ranking process for all projects that includes reallocation, making progress on the goals in Opening Doors and HUD’s policy priorities, and strengthening system-wide CoC coordination and engagement.  These points are summarized in charts and graphs. The spreadsheet is designed to be easily printed and shared if communities are simply looking for a summary of points to share.


          Goodbye, Lotus 1-2-3        
"The first killer app was VisiCalc. This early spreadsheet turned the Apple II from a hobbyist toy to a business computer. VisiCalc came with room for improvement, though. In addition, a new architecture and operating system, the Intel-based IBM PC and MS-DOS, also needed a spreadsheet to be taken seriously. That spreadsheet, released in early 1983, would be Lotus 1-2-3, and it would change the world. It became the PC's killer app, and the world would never be the same. On May 14, IBM quietly announced the end of the road for 1-2-3, along with Lotus Organizer and the Lotus SmartSuite office suite. Lotus 1-2-3's day is done." Impressive 30 year run.
          Bottom Dwellers About To Move North As At April 26, 2017        


This list results from a scan looking for stocks that have been drifting and are about to turn higher.

I did this scan many times on prior days to see what develops once stocks make this list.

Below are some charts from today's list; you will see that they have all started to move higher.

Please note that these scans should not be taken as trading advice. Each of these scans is in a development phase; they are being refined as each day passes.
Watchlists are being studied and refinements made; until there is a good degree of reliability, these scans are suitable only for recording the results.




          Bottom Dwellers About To Move North as at April 25, 2017        

This scan reveals stocks that are showing signs of moving higher in the next couple of sessions.

Perhaps a good idea to make a note of these and observe them, there may be an opportunity here.


[image: scan results with prices]


Update April 28, 2017


I took the prices above and made a comparison to the close on Friday, April 28. 

You can see that there has not been great appreciation in price.

However, for most of these, there was a gain on the day after they were first posted.

In coming days I will be attempting to refine the scan to get a longer-term gain.

Please note that these scans should not be taken as trading advice. Each of these scans is in a development phase; they are being refined as each day passes.
Watchlists are being studied and refinements made; until there is a good degree of reliability, these scans are suitable only for recording the results.

          Eclipse Acceleo Day        

The first Eclipse Acceleo Day is being held in Nantes, France on July 10, 2009. Acceleo is a component of the Eclipse Model to Text (M2T) project and the workshop will be an event where users and developers of Acceleo can meet, present their planned extensions, and discuss model-driven engineering.

Topics of interest at the workshop will include MOF-to-Text language, validation with Acceleo, scripting generation, comparisons to other generative engines, and integration of Acceleo in an industrial tool chain. Workshop attendees are welcome to present their work in a 20-minute demo to get feedback from the community. See the event page for participation details.

Eclipse Acceleo Day is co-located with the 10th Libre Software Meeting. The event is free but you must register by July 3 in order to attend.


          Twenty reasons to stop using excel        

Small businesses and even big enterprises use Excel instead of a database system for storing data and preparing reports. At first, or at the start of a new business, it looks like an easy, cost-effective, and efficient way to handle data and reports. Serious concerns arise when you have lots of data to manage. You may run into the following problems if you keep going with Excel:

1. Slow execution: a large Excel file can be slow; for Excel, "large" means a file of a few thousand rows. Macros can be created to speed up your work, but sometimes macros make things worse.

2. An Excel file may contain a virus. Yes, a virus can easily be programmed into Excel via a macro and attached to the file. It can be a big threat to the security of data as well as to the security of systems in the organization. Sharing Excel files as email attachments and downloading them weakens security and causes duplication of the same files within the organization.

3. Easy data manipulation: data can be manipulated by any user, which can lead to billion-dollar losses for your company. Anybody can alter a formula or a cell value. You cannot track why the data is wrong or who made the change (if the file is not in sharing mode).

4. Lots of human errors: because there is no control over the entry of validated data, Excel files are error-prone. A simple Ctrl+D pressed by mistake can drastically change your insights. A human error during data entry cannot be tracked easily. You can validate and track the data yourself, but this eats up precious time.

5. Auto data type: Excel tries to guess the data type and format on its own, so you have to keep an eye on the format of the data all the time. For example, you cannot keep a leading 0 in front of a number (like 011); Excel removes it.

6. Difficult troubleshooting: an Excel expert can apply complex formulas to make calculations easy, but it is difficult to troubleshoot errors in them. There is no mechanism to test and debug a mistakenly changed or wrong formula.

7. Business processes should be domain-centric, not tool-centric. Although Excel is easy to learn and easy to handle, it ends up demanding more expertise than customized software would.

8. Excel is a decentralized tool. You need to locate each file on the system every time you need it. If a huge amount of data is spread across many different files, finding the right one can be a disaster.

9. Not fit for reporting: collaborative activities like planning, forecasting, budgeting, and reporting need a standard, well-defined format. Excel, however, is very much personalized software; every individual has their own format and design of reports, so it is very difficult to set a standard for everyone involved in collaborative activities, and consolidating these Excel files becomes a tedious job.

10. Not a good analysis tool: after collecting data, analysis comes into play. Data analysis is a crucial part of data management, but Excel is not powerful enough to analyze data directly. You can do this to some extent with graphical representations, but accuracy and decision-making capability remain big constraints. Excel is not a project management tool.

11. Data recovery: it is not possible to recover deleted or lost data in Excel. It has no backup system like a database tool does. Loss of data can be disastrous for any business: a computer crash, hardware failure, or virus can wipe out all your hard work and permanently destroy important data. Microsoft provides an auto-recovery feature, but it helps only to a point.

12. Users need to develop a strong understanding of Excel: how to handle data, formulas, and macros. The person dealing with the data may be non-technical, which leads to slow processes and error-prone data.

13. Time consuming: data entry, formatting, error checking, applying formulas, and manipulating data all take a lot of human effort and time. It can become very tedious and hurt productivity.

14. Cost matters: there are plenty of options that give you an equivalent system for a lower price. The cloud-based version of Excel requires a subscription to Office 365, while desktop editions also take dollars out of your pocket.

15. Dependency: you may feel lucky if you have a macro expert in your company. But think again: does your business have control over that data, or does the macro expert? A business should be process-dependent, not person-dependent.

16. Excel crashes: Excel can crash when working with large amounts of data. Sometimes you get a "not responding" message and Excel hangs. You are then in one of two situations: either it returns to normal functioning after a wait (time-consuming), or it crashes completely, which can mean loss of data (fatal) or redoing the same work (again time-consuming).

17. No real-time update of data: because Excel files are maintained manually, data entry may happen well after the event it records. Keeping your spreadsheet up to date is always a hassle.

18. Sharing means a mess: you could say Excel provides a multi-user feature when a file is shared. In practice it does not, because everyone has read and write control, and the kind of sharing that allows simultaneous work on the same cell reference is not real multi-user support.

19. Data duplication: you cannot prevent duplicate data at the point of entry (duplication can be detected using validation in Excel, but that is time-consuming). A database enforces this directly, as shown in the sketch after this list.

20. You need to arrange training on complex report logic for every new employee. Nowadays, on-the-job training is one of the biggest challenges in every industry.
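As a minimal sketch of the kind of control a database gives you at the point of entry (compare reasons 3, 4, 5 and 19 above), here is an illustrative example using Python's built-in sqlite3 module; the table and column names are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
CREATE TABLE customers (
    customer_code TEXT PRIMARY KEY,                -- duplicates are rejected automatically
    name          TEXT NOT NULL,
    phone         TEXT CHECK (length(phone) >= 7)  -- simple validation at entry time
)
""")

# Stored as text, so a leading zero such as '011' is preserved.
conn.execute("INSERT INTO customers VALUES ('011', 'Acme Ltd', '5551234')")

try:
    conn.execute("INSERT INTO customers VALUES ('011', 'Acme again', '5551234')")
except sqlite3.IntegrityError as err:
    print("duplicate rejected:", err)

try:
    conn.execute("INSERT INTO customers VALUES ('012', 'Bad row', '123')")
except sqlite3.IntegrityError as err:
    print("invalid phone rejected:", err)
```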

Time to upgrade: release all your Excel stress and get efficient. Think it over: do you need a database or a customized reporting tool? It is about managing data efficiently, and that should be your main agenda before going forward. Focus on your core business and adopt customized data handling or a customized reporting tool. Here, rather than going with a global consultancy, your organization can engage an expert consultant; that will get you better data management without investing more money. A few days ago I visited my college as an alumnus and suggested that the college get rid of Excel for office use. The chairman asked me, "How much will it cost?" I smiled and replied, "Less than the fee I paid." I work with Excel and databases together at the corporate level and feel that some big enterprises also need automation. Work smarter, not harder. Are your people spending hours manually extracting data, cutting and pasting to create reports, and then repeating the whole thing time and time again? Consolidating data across sources, crunching numbers, and building reports and dashboards by hand? There are lots of reasons to automate for better output and revenue.


          Comment on Nikon D500 Setup Guide Spreadsheet by Fiorenza        
I thank you very much!
          Comment on Nikon D500 Setup Guide Spreadsheet by Belgianmisha        
Thank you very much! Helps a lot to get started with the D500
          Nether Regions        

A fascinating insight, while I was in Europe last week, into the politics and play of regionalism. Work and family gave me two perspectives: a weekend with nephews and nieces on the West Coast of Scotland and a couple of days struggling with spreadsheets in an attic apartment in Barcelona.


          Company Managed Vacation Rental Perks        
It is 5 o'clock in the afternoon and you are about to punch out from your Friday shift. Ahead of you lies a week away from your cubicle. No more Excel spreadsheets, no more queries to be run, no more focusing on SEO, and best of all, no more escalations to resolve!... Well, at least for another week.
          Mavic Parts Inventory Matrix        
Here's a link to a spreadsheet of a partial Mavic parts list that I normally stock. The green cells are items that I usually have available. Currently, this parts matrix only goes back to 2006, but I plan on adding to it as soon as I have the time. Keep in mind that I have a lot of pre-2006 parts still available that even Mavic no longer stocks. Just click on the yellow RogueMechanic Store link located to the upper right side of this page or click on the Store menu tab above. If you need an item that I...
          Reply #309        
No, I am a rookie at this; I see things I like. I liked the idea of breaking the weeks down on the spreadsheet: you can always go back and reference it more easily, I would think, and see patterns. Good idea
          What's ahead for carbon markets after COP 21        

The following was published in the February 2016 edition of Biores, a publication of the International Centre for Trade and Sustainable Development.

By Anthony Mansell, International Fellow, Center for Climate and Energy Solutions (C2ES)

The new climate deal includes several provisions relevant to market-based emissions reductions efforts. 

At a UN conference in Paris, France in December countries agreed to a new framework for international cooperation on climate change. The “Paris Agreement” ties together nationally determined contributions (NDCs) with international rules and procedures to ensure transparency and promote rising ambition. Paris also provided a future for international market mechanisms as a tool for countries to fulfil their NDCs.

Many NDCs submitted as part of the Paris process demonstrate an enthusiasm for market approaches. Sixty-five governments say they will use international markets and another 24 will consider using them in the future. Many groups such as the Carbon Pricing Leadership Coalition (CPLC) urged support in Paris for the use of market mechanisms and a ministerial declaration issued by 18 governments at the close of the conference was designed to send “a clear signal to the global carbon market…that there is an important role for markets in the post-2020 period.”

The Paris Agreement includes provisions that can advance carbon markets in two ways: by ensuring there is no double counting when countries engage in emissions trading, and by establishing a new mechanism to facilitate trading. In both areas, however, the text provides only broad parameters and important details remain to be decided. This article addresses the current state of carbon markets, their history in international climate agreements, and relevant provisions of the Paris deal – including issues still to be negotiated before it comes into effect.

Carbon market context

Carbon pricing is currently in place in 38 jurisdictions, according to the World Bank, encompassing both carbon taxes and emissions trading schemes (ETS). A number of additional policies are scheduled to enter force between now and 2020 including carbon taxes planned for Chile and South Africa. Ontario will develop an ETS similar to neighbouring Québec and US states Washington and Oregon are considering the same. In terms of scale, the most significant will be a new national ETS in 2017 across China, the world’s largest greenhouse gas (GHG) emitter.

Not all carbon market programmes seek to trade internationally; some focus solely on domestic emission reductions. Nevertheless, bottom-up linkages are already occurring. For example, California and Québec have linked their cap-and-trade programs, making carbon allowances and offsets fungible between programs. There are also ongoing discussions in California about using sector-based offsets that reduce deforestation – known as REDD+ – from Acre, Brazil and Chiapas, Mexico. The EU Emissions Trading System (EU ETS) and Swiss ETS have agreed a link, pending ratification by each.

In addition, the International Civil Aviation Organisation (ICAO) is to decide by the end of this year on the design of a global market-based mechanism (MBM) to reduce emissions from aviation. The MBM would come into force in 2020, around the same time the Paris Agreement aims to be in place.

History of international market mechanisms

Market-based approaches are not referred to in the founding 1992 UN Framework Convention on Climate Change (UNFCCC) document, but were integral to the design of its first sub-agreement, the 1997 Kyoto Protocol.

Under Kyoto, participating developed countries have binding emission limits – “quantified emission limitation and reduction commitments” – inscribed in Annex B of the agreement. They are allocated “assigned amount units” (AAUs) in line with those targets and, to enable least-cost emission reduction, are permitted to trade AAUs and other certified emission units.

Kyoto established three methods for transferring units – either emission allowances or emission reductions – between countries. International Emissions Trading (IET) allows countries that have reduced emissions below their targets to sell excess allowances to countries whose emissions exceed their targets. Joint Implementation (JI) allows Annex B countries to earn emission reduction units (ERUs) through emission reduction or removal projects in other Annex B countries. The Clean Development Mechanism (CDM) allows Annex B countries to earn certified emission reduction (CERs) credits through emissions-reduction projects in developing countries.

Emissions trading under the Kyoto Protocol relies on international oversight. All transfers are tracked using a registry called the International Transaction Log (ITL). A common accounting standard applies to all countries with emission targets. An executive board must approve the methodology CDM projects propose using. Finally, under the Protocol, only the international transfers it sanctions are considered legitimate to fulfil a country’s emissions-cutting obligations.

The Kyoto model provides important infrastructure for an international carbon market. Common accounting procedures ensure that any transfer meets an internationally agreed level of environmental integrity. An AAU allocated to Switzerland represents a metric tonne of emissions measured using the same standard as an AAU allocated to Norway. Common offset methodologies give a blueprint to replicate in projects across the globe. The CDM has been able to issue 1.4 billion credits – each representing a metric tonne of avoided emissions – and mobilise over US$400 billion in investment using this international rulebook for managing offset projects. Moreover, when countries submit their national GHG inventories, any recorded transfers can be verified by checking the international registry thereby reducing the potential for emissions double counting.

The Kyoto Protocol's market mechanisms have, however, lately encountered shrinking participation. One reason has been a reliance on the EU ETS as a source of demand, where low economic growth and restrictions placed on the types of credits have created a generous oversupply of CDM credits.

The Paris Agreement and carbon markets

The Paris Agreement establishes a fundamentally different framework from Kyoto. Rather than binding emission limits, which readily lend themselves to market approaches, the new climate regime requires all parties to undertake nationally determined contributions of their own choosing. As of writing, 187 countries had put forward NDCs, presenting various 2020-2030 target reduction dates.  These contributions are not legally binding and come in many forms, ranging from absolute economy-wide targets to peaking years, carbon intensity reductions, and so on. A new transparency system will apply to all parties, but will be less prescriptive than the accounting of AAUs that underpinned the Kyoto Protocol.

Fitting market approaches into this new landscape poses a different set of challenges. In a literal sense, the Paris Agreement is silent on markets, in that the term does not feature in the text. This is not unusual, the Kyoto Protocol also did not include the term. Instead, the new agreement houses markets under Article 6, geared towards addressing “voluntary cooperation” between parties in achieving their NDCs.

Article 6 recognises that parties may choose to pursue voluntary cooperation in implementing their NDCs. If these “cooperative approaches” involve the use of “internationally transferred mitigation outcomes,” or ITMOs, robust accounting shall be used to avoid double counting. The use of ITMOs is voluntary and must be authorised by the participating parties.

The same article also establishes a mechanism to contribute to GHG mitigation and support sustainable development. The new mechanism will be under the authority of the meeting of the parties to the Paris Agreement. It has four listed aims: to promote greenhouse gas mitigation while fostering sustainable development; to incentivise and facilitate participation by public and private entities authorised by a party; to contribute to reducing emission levels in the host country, reductions which can also be used by another party to fulfil its NDC; and to deliver an overall reduction in global emissions. In addition, emission reductions generated by the new mechanism must not be double counted. A share of proceeds will be used to cover administrative expenses and assist developing countries in meeting the costs of adaptation, similar to the share of proceeds under the CDM, a portion of which was channelled to the Adaptation Fund. Articles 6.8 and 6.9 contain a framework for promoting “integrated, holistic and balanced non-market approaches.”

So what comes next? When the CDM, JI, and IET were established under the Kyoto Protocol, the details were not finalised until the Marrakech Accords four years later. Similarly, the COP21 outcome sets out a work plan for negotiators to deliberate and decide how the Paris system will work, to be addressed in upcoming UNFCCC meetings.

Cooperative approaches accounting

The existing UNFCCC accounting system is bifurcated between developed and developing economies. Under the Convention, GHG inventories are required each year for industrialised countries, while these are included in national communications submitted every four years for developing nations.

The Paris Agreement establishes an “enhanced transparency framework for action and support,” with built-in flexibility to take into account national capacities. Under this framework each party must submit a national greenhouse gas inventory. An accompanying decision elaborates that all countries – except least developed countries and small island developing states – shall provide these inventories at least biennially.

On markets, the Subsidiary Body for Scientific and Technological Advice (SBSTA) will develop and recommend guidance on how to apply “robust accounting” for cooperative approaches, for adoption at the first session of the governing body of the Paris Agreement, known as the CMA. Countries will need to be “consistent” with this guidance, but not necessarily follow it strictly. How to determine whether a country’s accounting is consistent is not clarified in the Paris Agreement, though it will likely be reviewed as part of the new transparency system.

Pending decisions will provide greater clarity on a number of issues. On ITMOs, it will be useful to define the scope of what can be considered a “mitigation outcome” transferred between countries. Under Kyoto, AAUs serve as a unit of account for transferring obligations, but also define the scope of accepted international transfers. In other words, only transfers involving AAUs are accepted when submitting national GHG accounts. Parties will also need to consider whether other forms of cooperation – such as Japan’s Joint Crediting Mechanism (JCM), which is similar to the CDM, or the bilateral linking of two ETSs – would be considered ITMOs. Transfers involving one or more countries without absolute economy-wide targets could complicate the methodology needed to avoid double counting.

On the accounting system, the CMA could take an active role in facilitating transfers, including through a central registry similar to the ITL. Alternatively, in a more decentralised system, it may require that parties maintain their own accounting – such as double-entry bookkeeping – and rely on the transparency arrangements to provide oversight. The provision referencing ITMOs also requires parties to “promote sustainable development and ensure environmental integrity.” The SBSTA guidelines will need to define these terms and how countries will meet them when undertaking transfers.
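
To make the decentralised option concrete, here is a minimal sketch of double-entry style accounting for transfers, written in Python purely for illustration; the class and method names are invented and nothing here reflects agreed UNFCCC procedure. The idea is simply that every tonne transferred out of one party's account triggers a corresponding adjustment in the other's, so the same mitigation outcome cannot be claimed twice.

    # Illustrative only: invented names, not an agreed UNFCCC procedure.
    from collections import defaultdict

    class TransferLedger:
        def __init__(self):
            self.adjustments = defaultdict(float)  # net corresponding adjustments, tonnes CO2e
            self.log = []                          # audit trail, loosely analogous to the ITL

        def transfer(self, seller, buyer, tonnes):
            """Double-entry record: the seller's inventory is adjusted up by the
            tonnes sold and the buyer's is adjusted down, so global totals are unchanged."""
            self.log.append((seller, buyer, tonnes))
            self.adjustments[seller] += tonnes
            self.adjustments[buyer] -= tonnes

        def adjusted_inventory(self, party, reported_emissions):
            return reported_emissions + self.adjustments[party]

    ledger = TransferLedger()
    ledger.transfer("Party A", "Party B", 1_000_000)
    print(ledger.adjusted_inventory("Party A", 50_000_000))  # 51,000,000
    print(ledger.adjusted_inventory("Party B", 80_000_000))  # 79,000,000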

Paris “mechanism”

Another accompanying COP decision recommends that the CMA adopt “rules, modalities, and procedures” for the new mechanism at its first session. The parameters for these are: voluntary participation authorised by each party involved; real, measurable, and long-term benefits related to the mitigation of climate change; a specific scope of activities; reductions in emissions that are additional to any that would otherwise occur; verification and certification of emission reductions resulting from mitigation activities by designated operational entities; and experience gained with, and lessons learned from, existing mechanisms and approaches adopted under the Convention.

This leaves much to be hammered out by governments. A key area to address will be the type of system. The new mechanism may continue to credit at a project level. A Brazilian proposal in Paris envisioned a mechanism similar in scale to the CDM, referred to as an “enhanced CDM,” or “CDM+.” Conversely, in prior discussions for a “new market mechanism” (NMM), both the EU and the Environmental Integrity Group negotiating group have proposed a scaled-up or sector-based crediting mechanism.

The future of the Kyoto flexibility mechanisms is also unclear, in particular whether the new mechanism will succeed the CDM and JI, or will sit alongside either of these. The Paris Agreement does not mention the CDM or JI, but notes that the new mechanism should draw on the experience gained from existing mechanisms. Similarly, it is unclear whether units generated under the Kyoto mechanisms will be eligible for compliance after 2020 and, if so, whether they will need to be converted to an alternative credit type to conform with credits issued under the new mechanism.

Negotiators may also decide to transfer project methodologies over from the CDM to apply to the new mechanism, discard some of these existing approaches, or move away from project level crediting altogether as noted above. They may also consider other methodologies used outside the UNFCCC. Finally, the Paris Agreement frames sustainable development on a par with GHG mitigation, so parties may require measured sustainable development outcomes to be eligible for crediting.

Parties will need to decide on governance arrangements for the new mechanism. The CDM is managed by an Executive Board of ten government officials, comprising one member from each of the five UN regional groups, two other members from parties included in Annex I, two other members from non-Annex I parties, and one representative of the small island developing states. Similarly, JI has a supervisory committee (JISC) to oversee the verification of projects. The new mechanism could incorporate governance from either of these existing platforms. Guidance on rules and procedures will also need to be clarified. The CDM and JI have existing procedures for developing projects that are ultimately credited. Countries could transfer these rules to the new mechanism or adopt new procedures.

Given the breadth of views across governments on the role of market mechanisms, reaching conclusions on these issues will be challenging. The slow progress since 2011 in the UNFCCC toward a “framework for various approaches” (FVA) and NMM demonstrated the difficulties in gaining consensus on the subject. Nevertheless, the importance afforded to international markets by many countries in their NDCs implies there is a strong impetus to find a workable system for international transfers.

Efforts beyond UNFCCC

It is possible that initiatives undertaken outside the UNFCCC will inform efforts within. The Carbon Market Platform established under the G7, for example, is a strategic political dialogue that can complement the UNFCCC in developing guidance on accounting for international transfers. The system that ICAO builds could seek consistency with the Paris Agreement. For example, it would be beneficial if credits used for compliance in the UNFCCC and ICAO were fungible, so that project developers are not forced to choose between separate pools of buyers. It remains to be decided what types of international credits will be used for compliance in the ICAO MBM, but this should take into account the emergence of the new mechanism. In addition, the accounting system used by ICAO should at least be consistent with that used under the Paris system, insofar as this would avoid the double counting of units used for compliance in both ICAO and the UNFCCC.

Unfinished business

Paris reaffirmed carbon markets as an instrument for meeting climate goals. Outside of the agreement itself, groups such as the CPLC are building strong momentum for market approaches as a key component of meeting the mitigation targets set by NDCs. COP21 did not, however, finalise a new system of international carbon markets or cooperative approaches. Accounting for ITMOs and other forms of voluntary cooperation requires elaboration and guidance. The role of the new mechanism remains to be negotiated. And if these talks stall, as was the case for the FVA/NMM deliberations, interested countries may pursue bottom-up linkages elsewhere rather than continue to search for solutions within the UN climate talks. The pace and extent of progress under the UNFCCC will determine how central a role multilateral platforms will play on these issues in the future, and the prospects of a truly global carbon market.


          CPR Works        
If you are a small business owner who finds running things the old-fashioned way a little hectic and you do not own a computer, you may want to invest in one. A computer is often one of the greatest assets a company can have when it comes to managing the business and making it successful. Here are some of the ways a computer can benefit your small business.





Let us say that your small business is a store, and like all stores, inventory is required in order to keep track of everything that is bought and sold. Is it not annoying to write everything out by hand? Using a computer to make a spreadsheet of your inventory, instead of writing it out by hand, is not only much easier but also saves a lot of time. The best part is that a computer makes inventory a largely paperless procedure, aside from printing. Another advantage a small store owner gains from using a computer is that almost all online orders are as easy as, or easier than, ordering by telephone, saving the time wasted on making phone calls and frequently being put on hold. A simple point and click will take care of your orders and get you back to handling business.





Let us say that you would like to start selling things from home. While putting out advertisements in your local newspaper is an acceptable way to get noticed, the World Wide Web is a much faster and more efficient way to sell whatever it is you need to sell. The Internet can also keep you in contact with your buyers. You can even use an online auction or selling website to sell things from your home.





Another advantage of using computers that small business owners will find is that the computer can simplify almost all tasks. You are able to work in an almost completely paperless environment, so there is less waste. If you are worried that you are too behind the times to use a computer, you do not need to worry. Many computers today are designed to be user friendly and are surprisingly accessible to almost all new users. Computers also do not take up very much space, so they are a tool that you will find space efficient.


Aydan Corkern is a writer and you can visit his websites, computer repair and computer repair houston

cpr works: CPRWorks


Article Source: www.articlesnatch.com


          The Strange Loop 2013        

This was my second time at The Strange Loop. When I attended in 2011, I said that it was one of the best conferences I had ever attended, and I was disappointed that family plans meant I couldn't attend in 2012. That meant my expectations were high. The main hotel for the event was the beautiful DoubleTree Union Station, an historic castle-like building that was once an ornate train station. The conference itself was a short walk away at the Peabody Opera House. Alex Miller, organizer of The Strange Loop, Clojure/West, and Lambda Jam (new this year), likes to use interesting venues, to make the conferences extra special.

I'm providing a brief summary here of what sessions I attended, followed by some general commentary about the event. As I said last time, if you can only attend one conference a year, this should be the one.

  • Jenny Finkel - Machine Learning for Relevance and Serendipity. The conference kicked off with a keynote from one of Prismatic's engineering team talking about how they use machine learning to discover news and articles that you will want to read. She did a great job of explaining the concepts and outlining the machinery, along with some of the interesting problems they encountered and solved.
  • Maxime Chevalier-Boisvert - Fast and Dynamic. Maxime took us on a tour of dynamic programming languages through history and showed how many of the innovations from earlier languages are now staples of modern dynamic languages. One slide presented JavaScript's take on n + 1 for various interesting values of n, showing the stranger side of dynamic typing - a "WAT?" moment.
  • Matthias Broecheler - Graph Computing at Scale. Matthias opened his talk with an interesting exercise of asking the audience two fairly simple questions, as a way of illustrating the sort of problems we're good at solving (associative network based knowledge) and not so good at solving (a simple bit of math and history). He pointed out the hard question for us was a simple one for SQL, but the easy question for us would be a four-way join in SQL. Then he introduced graph databases and showed how associative network based questions can be easily answered and started to go deeper into how to achieve high performance at scale with such databases. His company produces Titan, a high scale, distributed graph database.
  • Over lunch, two students from Colombia told us about the Rails Girls initiative, designed to encourage more young women into the field of technology. This was the first conference they had presented at and English was not their native language so it must have been very nerve-wracking to stand up in front of 1,100 people - mostly straight white males - and get their message across. I'll have a bit more to say about this topic at the end.
  • Sarah Dutkiewicz - The History of Women in Technology. Sarah kicked off the afternoon with a keynote tour through some of the great innovations in technology, brought to us by women. She started with Ada Lovelace and her work with Charles Babbage on the Analytical Engine, then looked at the team of women who worked on the ENIAC, several of whom went on to work on UNIVAC 1. Admiral Grace Hopper's work on Flow-Matic - part of the UNIVAC 1 project - and subsequent work on COBOL was highlighted next. Barbara Liskov (the L in SOLID) was also covered in depth, along with several others. These are good role models that we can use to encourage more diversity in our field - and to whom we all owe a debt of gratitude for going against the flow and making their mark.
  • Evan Czaplicki - Functional Reactive Programming in Elm. This talk's description had caught my eye a while before the conference, enough so that I downloaded Elm and experimented with it, building it from source on both my Mac desktop and my Windows laptop, during the prerelease cycle of what became the 0.9 and 0.9.0.2 versions. Elm grew out of Evan's desire to express graphics and animation in a purely functional style and has become an interesting language for building highly interactive browser-based applications. Elm is strongly typed and heavily inspired by Haskell, with an excellent abstraction for values that change over time (such as mouse position, keyboard input, and time itself). After a very brief background to Elm, Evan live coded the physics and interaction for a Mario platform game with a lot of humor (in just 40 lines of Elm!). He also showed how code updates could be hot-swapped into the game while it was running. A great presentation and very entertaining!
  • Keith Adams - Taking PHP Seriously. Like CFML, PHP gets a lot of flak for being a hot mess of a language. Keith showed us that, whilst the criticisms are pretty much all true, PHP can make good programmers very productive and enable some of the world's most popular web software. Modern PHP has traits (borrowed from Scala), closures, generators / yield (inspired by Python and developed by Facebook). Facebook's high performance "HipHop VM" runs all of their PHP code and is open source and available to all. Facebook have also developed a gradual type checking system for PHP, called Hack, which is about to be made available as open source. It was very interesting to hear about the pros and cons of this old warhorse of a language from the people who are pushing it the furthest on the web.
  • Chiu-Ki Chan - Bust the Android Fragmentation Myth. Chiu-Ki was formerly a mobile app developer at Google and now runs her own company building mobile apps. She walked us through numerous best practices for creating a write-once, run-anywhere Android application, with a focus on various declarative techniques for dealing with the many screen sizes, layouts and resolutions that are out there. It was interesting to see a Java + XML approach that reminded me very much of Apache Flex (formerly Adobe Flex). At the end, someone asked her whether similar techniques could be applied to iOS app development and she observed that until very recently, all iOS devices had the same aspect ratio and same screen density so, with auto-layout functionality in iOS 6, it really wasn't much of an issue over in Apple-land.
  • Alissa Pajer - Category Theory: An Abstraction for Everything. In 2011, the joke was that we got category theory for breakfast in the opening keynote. This year I took it on by choice in the late afternoon of the first day! Alissa's talk was very interesting, using Scala's type system as one of the illustrations of categories, functors, and morphisms to show how we can use abstractions to apply knowledge of one type of problem to other problems that, without category theory, we might not recognize as being similar. Like monads, this stuff is hard to internalize, and it can take many, many presentations, papers, and a lot of reading around the subject, but the abstractions are very powerful and, ultimately, useful.
  • Jen Myers - Making Software Development Make Sense For Everyone. Closing out day one was a keynote by Jen Myers, primarily known as a designer and front end developer, who strives to make the software process more approachable and more understandable for people. Her talk was a call for us all to help remove some of the mysticism around our work and encourage more people to get involved - as well as to encourage people in the software industry to grow and mature in how we interact. As she pointed out, we don't really want our industry to be viewed through the lens of movies like "The Social Network", which makes developers look like assholes!
  • Martin Odersky - The Trouble with Types. The creator of Scala started day two by walking us through some of the commonly perceived pros and cons of both static typing and dynamic typing. He talked about what constitutes good design - discovered, rather than invented - and then presented his latest work on type systems: DOT and the Dotty programming language. This collapses some of the complexities of parameterized types (from functional programming) down onto a more object-oriented type system, with types as abstract members of classes. Compared to Scala (which has both functional and object-oriented types), this provides a substantial simplification without losing any of the expressiveness, and could be folded into "Scala.Next" if they can make it compatible enough. This would help remove one of the major complaints against Scala: the complexity of its type system!
  • Mridula Jayaraman - How Developers Treat Ovarian Cancer. I missed Ola Bini's talk on this topic at a previous conference so it was great to hear one of his teammates provide a case study on this fascinating project. ThoughtWorks worked with the Clearity Foundation and Annai Systems - a genomics startup - to help gather and analyze research data, and to automate the process of providing treatment recommendations for women with ovarian cancer. She went over the architecture of the system and (huge!) scale of the data, as well as many of the problems they faced with how "dirty" and unstructured the data was. They used JRuby for parsing the various input data and Clojure for their DSLs, interacting with graph databases, the recommendation engine and the back end of the web application they built.
  • Crista Lopes - Exercises in Style. Noting that art students are taught various styles of art, along with analysis of those styles, and the rules and guidelines (or constraints) of those styles, Crista observed that we have no similar framework for teaching programming styles. The Wikipedia article on programming style barely goes beyond code layout - despite referencing Kernighan's "Elements of Programming Style"! She is writing a book called "Exercises in Programming Style", due in Spring 2014 that should showcase 33 styles of programming. She then showed us a concordance program (word frequencies) in Python, written in nine different styles. The code walkthrough got a little rushed at the end but it was interesting to see the same problem solved in so many different ways. It should be a good book and it will be educational for many developers who've only been exposed to one "house" style in the company where they work.
  • Martha Girdler - The Javascript Interpreter, Interpreted. Martha walked us through the basics of variable lookups and execution contexts in JavaScript, explaining variable hoisting, scope lookup (in the absence of block scope) and the foibles of "this". It was a short and somewhat basic preso that many attendees had hoped would be much longer and more in depth. I think it was the only disappointing session I attended, and only because of the lack of more material.
  • David Pollak - Getting Pushy. David is the creator of the Lift web framework in Scala that takes a very thorough approach to security and network fallibility around browser/server communication. He covered that experience to set the scene for the work he is now doing in the Clojure community, developing a lightweight push-based web framework called Plugh that leverages several well-known Clojure libraries to provide a seamless, front-to-back solution in Clojure(Script), without callbacks (thanks to core.async). Key to his work is the way he has enabled serialization of core.async "channels" so that they can be sent over the wire between the client and the server. He also showed how he has enabled live evaluation of ClojureScript from the client - with a demo of a spreadsheet-like web app that you program in ClojureScript (which is round-tripped to the server to be compiled to JavaScript, which is then evaluated on the client!).
  • Leo Meyerovich - Thinking DSLs for Massive Visualization. I had actually planned to attend Samantha John's presentation on Hopscotch, a visual programming system used to teach children to program, but it was completely full! Leo's talk was in the main theater so there was still room in the balcony and it was an excellent talk, covering program synthesis and parallel execution of JavaScript (through a browser plugin that offloads execution of JavaScript to a specialized VM that runs on the GPU). The data visualization engine his team has built has a declarative DSL for layout, and uses program synthesis to generate parallel JS for layout, regex for data extraction, and SQL for data analysis. The performance of the system was three orders of magnitude faster than a traditional approach!
  • Chris Granger - Finding a Way Out. Some of you may have been following Chris's work on LightTable, an IDE that provides live code execution "in place" to give instant feedback as you develop software. If you're doing JavaScript, Python, or Clojure(Script), it's worth checking out. This talk was more inspirational than product-related (although he did show off a proof of concept of some of the ideas, toward the end). In thinking about "How do we make programming better?" he said there are three fundamental problems with programming today: it is unobservable, indirect, and incidentally complex. As an example, consider person.walk(), a fairly typical object-oriented construct, where it's impossible to see what is going on with data behind the scenes (what side effects does it have? which classes implement walk()?). We translate from the problem domain to symbols and add abstractions and indirections. We have to deal with infrastructure and manage the passage of time and the complexities of concurrency. He challenged us that programming is primarily about transforming data and posited a programming workflow where we can see our data and interactively transform it, capturing the process from end to end so we can replay it forwards and backwards, making it directly observable and only as complex as the transformation workflow itself. It's an interesting vision, and some people are starting to work on languages and tools that help move us in that direction - including Chris with LightTable and Evan with Elm's live code editor - but we have a long way to go to get out of the "tar pit".
  • Douglas Hofstadter, David Stutz, a brass quintet, actors, and aerialists - Strange Loops. The two-part finale to the conference began with the author of "Gödel, Escher, Bach" and "I am a Strange Loop" talking about the concepts in his books, challenging our idea of perception and self and consciousness. After a thought-provoking dose of philosophy, David Stutz and his troupe took to the stage to act out a circus-themed musical piece inspired by Hofstadter's works. In addition to the live quintet, Stutz used Emacs and Clojure to provide visual, musical, and programmatic accompaniment. It was a truly "Strange" performance but somehow very fitting for a conference that has a history of pushing the edges of our thinking!

Does anything unusual jump out at you from the above session listing? Think about the average technical conference you attend. Who are the speakers? Alex Miller and the team behind The Strange Loop made a special effort this year to reach out beyond the "straight white male" speaker community and solicit submissions from further afield. I had selected most of my schedule, based on topic descriptions, before it dawned on me just how many of the speakers were women: over half of the sessions I attended! Since I didn't recognize the vast majority of speaker names on the schedule - so many of them were from outside the specific technical community I inhabit - I wasn't really paying any attention to the names when I was reading the descriptions. The content was excellent, covering the broad spectrum I was expecting, based on my experience in 2011, with a lot of challenging and fascinating material, so the conference was a terrific success in that respect. That so many women in technology were represented on stage was an unexpected but very pleasant surprise and it should provide an inspiration to other technology conferences to reach beyond their normal pool of speakers too. I hope more conferences will follow suit and try to address the lack of diversity we seem to take for granted!

I already mentioned the great venues - both the hotel and the conference location - but I also want to call out the party organized at the St Louis City Museum for part of the overall "wonder" of the experience that was The Strange Loop 2013. The City Museum defies description. It is a work of industrial art, full of tunnels and climbing structures, with a surprise around every corner. Three local breweries provided good beer, and there was a delicious range of somewhat unusual hot snacks available (bacon-wrapped pineapple is genius - that and the mini pretzel bacon cheeseburgers were my two favorites). It was quiet enough on the upper floors to talk tech or chill out, while Moon Hooch entertained loudly downstairs, and the outdoor climbing structures provided physical entertainment for the adventurous with a head for heights (not me: my vertigo kept me on the first two stories!).

In summary then, the "must attend" conference of the year, as before! Kudos to Alex Miller and his team!


          mahalloway on "Interactive excel database with multiple users"        

Hello,

I run a fantasy golf sports league. I currently use a custom Excel spreadsheet in which I keep everyone's data entries.

I am looking to have my league members go to my website and enter their weekly picks. Basically I have a sheet for each member as well as a standings and overall placement sheet.

On each member's sheet is a list of 1000 golfers they can choose from in 2 columns, as they can only choose a golfer twice through the year. Within that, I also have a placement column for their weekly picks. It is formulated to apply a certain number of points depending on the golfer's final placement.

I would like for each member to have their own login, so they cannot see anyone else's picks.

Is there a way for me to embed the database, allow members to select their weekly picks, and prevent them from seeing other members' picks?
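
To make the structure clearer, here is a rough sketch of the data the spreadsheet holds and how the weekly scoring works, written in Python with made-up golfer names and point values (the real sheet has 1000 golfers and a fuller points table):

    # Made-up names and point values, just to show the structure.
    POINTS_BY_PLACEMENT = {1: 100, 2: 75, 3: 60}  # real table covers every placement

    # Each member's picks: week -> golfer (a golfer may be picked at most twice per year).
    picks = {
        "member_1": {1: "Golfer A", 2: "Golfer B"},
        "member_2": {1: "Golfer C", 2: "Golfer A"},
    }

    # Weekly results: week -> {golfer: final placement}
    results = {
        1: {"Golfer A": 1, "Golfer C": 5},
        2: {"Golfer B": 2, "Golfer A": 3},
    }

    def season_points(member):
        total = 0
        for week, golfer in picks[member].items():
            placement = results.get(week, {}).get(golfer)
            total += POINTS_BY_PLACEMENT.get(placement, 0)
        return total

    # Overall standings, best score first.
    standings = sorted(picks, key=season_points, reverse=True)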

Thank you in advance and sorry for rambling.

Mark
18fantasyholes.com


          Financial Derivatives Part 1 Seminar-Training - PI ETA Consulting Company , UAE, Philippines, Saudi Arabia, Malaysia, Qatar         

PFD01 Financial Derivatives Part 1

Highlights 

  • In living life, the secret of the How's is actually in the Why's!
  • History is actually an important subject. Knowing history well tells us why we are here today. Knowing history really well can sometimes even tell us where we are heading into the future!
  • Have a good understanding of the Foreign Exchange and Interest Rate markets, and the main Treasury products available in these markets, including Derivative Instruments.
  • Acquire a clear understanding of Financial Derivatives through focusing on the essential Mathematical Concepts that form the building blocks of these instruments.
  • Master the dynamics of Financial Derivatives as part of an extended toolbox of Financial Risk Management, which will in turn increase optimality in hedging strategies.
  • Have an understanding of how knowledge in Derivatives in the Foreign Exchange markets can also be extended to include Equity and Commodity market derivative instruments.
  • Have a hands-on opportunity to build Yield Curves - the backbone of most financial derivative products - using spreadsheets (a minimal sketch of the idea follows this list).
  • Usage of The PERMIT® Treasury & Financial Risk Management System and practical experience in structuring and pricing Financial Derivatives.
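
For a flavour of the yield-curve exercise mentioned above, the following is a minimal sketch of bootstrapping discount factors from par swap rates. It is written in Python rather than a spreadsheet, and the rates, annual-payment convention, and flat year fractions are purely illustrative assumptions, not course material.

    # Illustrative bootstrap: assumes annual-pay par swaps with consecutive
    # maturities 1..N years and year fractions of 1.0.
    def bootstrap_discount_factors(par_swap_rates):
        """par_swap_rates: {maturity_in_years: par swap rate}. Returns {maturity: discount factor}."""
        dfs = {}
        for t in sorted(par_swap_rates):
            # Par condition: rate * (DF_1 + ... + DF_t) + DF_t = 1
            annuity = sum(dfs[i] for i in range(1, t))
            dfs[t] = (1.0 - par_swap_rates[t] * annuity) / (1.0 + par_swap_rates[t])
        return dfs

    curve = bootstrap_discount_factors({1: 0.020, 2: 0.023, 3: 0.025})
    for t, df in curve.items():
        zero = df ** (-1.0 / t) - 1.0  # implied annually compounded zero rate
        print(f"{t}y  DF = {df:.6f}  zero = {zero:.4%}")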

Seminar Facilitator

Dr. Jeffrey C. K. Lim

PhD, CSci, CMath, FIMA, FRM, PRM, BFel

===================================================

FTS Eligible (Funding)

This programme is approved for listing on the Financial Training Scheme (FTS) Programme Directory and is eligible for FTS claims subject to all eligibility criteria being met. 

Please note that in no way does this represent an endorsement of the quality of the training provider and programme. Participants are advised to assess the suitability of the programme and its relevance to participants' business activities or job roles.

The FTS is available to eligible entities, at a 50% funding level of programme fees subject to all eligibility criteria being met. FTS claims may only be made for programmes listed on the FTS Programme Directory with the specified validity period. 

For Singapore citizens aged 40 and above, FTS provides 90% Funding, subject to existing grant caps.


Cost:

Certified


          Australia ASX 300 Overextensions         

Over the last two days I have created spreadsheets for the constituents of the S&P500 and the FTSE-350, ranking them by overextension relative to the trend mean. Today I am conducting the same exercise for Australia’s ASX 300.
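
For readers who want to reproduce the exercise, here is a minimal sketch of how such an overextension ranking can be computed. Python with pandas is assumed, and the 200-day simple moving average is used here only as a stand-in for the trend mean.

    # Assumes Python with pandas; the 200-day SMA stands in for the trend mean.
    import pandas as pd

    def overextension(prices: pd.DataFrame, window: int = 200) -> pd.Series:
        """prices: close prices, one column per constituent, at least `window` rows.
        Returns each constituent's % distance from its trend mean, most overextended first."""
        trend_mean = prices.rolling(window).mean().iloc[-1]
        latest = prices.iloc[-1]
        return ((latest / trend_mean) - 1.0).sort_values(ascending=False)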

This is a particularly illiquid time of year and it takes less capital for traders to move markets. This is easiest where accelerated moves are in evidence: stops will have been placed, and algorithmic systems have little difficulty identifying them.


          How to Remodel Like an Engineer        
Here’s one of my favorite kitchens, built by Corona homeowner Rick Taylor for $7,500 during a three-week vacation. But here’s the best part: Rick’s wife Stephanie did not want her household with two small children disrupted by the remodel. To address her concerns, Rick produced a spreadsheet showing her precisely how he could get it done in a short amount of time. And he pulled it off. Has that ever happened in the history of the universe, where the husband was pushing the wife to do a remodel? I don’t think so. If you want to execute a similarly well-planned remodel, here’s some advice:

1. Be an engineer (like Rick) or think like one. You must be annoyingly organized to complete a DIY project on a tight timeframe. This is not for artistic, go-with-the-flow types.

2. Get as much training as possible on installing cabinets, setting tile, etc., via workshops at home improvement stores, from TV shows, and with books and magazines.

3. Be willing to compromise. While Stephanie was fed up to here with dingy tile grout lines and wanted a solid surface counter, Rick determined that would blow the budget. As an alternative he offered her larger tiles [...]
          CPAN RPMs        

If you’ve been reading my blog for a while, you’ll know that I have an interest in packaging CPAN modules as RPMs for Linux distributions based on Red Hat. For a few years, I’ve been (infrequently) building spreadsheets which list the various modules that are available as RPMs from the better known repositories (and my …

The post CPAN RPMs appeared first on Perl Hacks.


          Pricing Projects        

Pricing Projects

I see the topic of pricing work come up over and over again, so I thought I would share my thoughts on the matter. By no means am I a guru of pricing or anything, but I have come up with a decent system that seems to work for me. I am currently a “full time” woodworker, but haven’t been for long. There are many more experienced folks on LJ’s, so take this with a grain of salt.

I start with the same master Excel spreadsheet for every project. The spreadsheet has three separate sections – Labor, Materials, and Overhead.

I begin with labor. The first step is to pick an hourly rate for yourself. This can be hard, but I believe that in any for-profit situation (not a favor for a family member or anything like that) your hourly wage rate needs to be at least in the $15 range. For reference, I am in Fort Worth, Texas if you want to know what region I am basing that wage rate off of. I charge more than that now, but on my first few projects, I didn’t. It is really hard picking this rate for yourself because you feel like you are grading your own test. But remember, you are the guy with the tools and the knowledge. They came to you for a reason. It is because you possess skills and knowledge that they don’t have. That is worth money. Try as hard as you can not to short yourself.

The second step of the labor estimate is to painstakingly think through the entire build process, activity by activity. This usually takes me 2 hours or so. When I start thinking through my build sequence, I literally talk through how I am going to build the entire project. This is how the conversation with myself goes (it gets a little schizophrenic, but hang with me):

“Self, first you are going to have to go to the lumberyard. That will take you about 4 hours total, if you include driving and picking through lumber.”
- Punch in 4 hours for material pickup
“Self, after that, you will have to unload at the shop. That’ll take an hour”
- Punch in 1 hour for unloading
“Self, next you will have to do all of the milling. You can probably do that in 8 hours”
“No you can’t, idiot.”
“Ok, 10.”
“Get real, doofus”
“Ok , let’s just go with 12.”
“That’s better”
- Punch in 12 hours for milling

And so on. Now, as you are punching in numbers for hours and thinking through your build process, you should also be considering all of the materials you will be using for each step and putting them over in your material section. At the end of the labor section you should have a good idea not only of how long this is going to take, but also how you are going to do it. Once I am done with labor, I usually work through the build sequence once more, this time really focusing on materials. After the second time through the build process, you should have a very good labor and material estimate. Two tips for the man-hour estimate:
1) Don’t skip small stuff like material purchase, clean up, delivery, etc. That stuff adds up. And despite the fact that you don’t feel like you are working when you are doing it, you are.
2) Don’t overestimate your speed. I did that once and got obliterated on a price. Underestimate if you need to. Be realistic.

Overhead is a smaller chunk of the pie, especially for someone who doesn’t do much volume per year. Over the past year, I have done a decent amount and will be claiming my profits on my personal tax forms, so I factor in taxes, profit (10%ish depending), electricity, and gas in this section. I know taxes aren’t something that comes up for a lot of you, but if you neglect state and income tax and then end up having to pay them in the end, you will likely lose 30%ish of your revenue, not just your profit. That is why I try to keep that stuff in order and build in a buffer so that when I get a serious negative tax return, I have money in the bank to pay for it (just don’t spend that money on go-karts and stuff).

At the end, I just add up all labor, material, and overhead. That is my final price. I look at it every time and think it is astronomical. I tell myself that I have to lower my hourly rate. Or I have to do without some of the materials. Anything to bring the price down. But then I realize that despite my own reservations about how good I am and what I am worth, I am still probably better at woodworking than at least 99.5% of the population. And a lot of you likely are too. Not many people know as much about woodworking as us dorks, and we have to use that to our advantage. Our knowledge base, accrued skills, and time are worth something. So don’t screw yourself over.
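
For illustration only, here is a minimal sketch of that add-it-up approach in Python. The task list, hourly rate, and percentages below are made-up numbers for the example, not my actual figures.

    # Made-up numbers for illustration; plug in your own rate, hours, and materials.
    HOURLY_RATE = 15.0

    labor_hours = {
        "material pickup": 4,
        "unloading": 1,
        "milling": 12,
        "joinery and assembly": 20,
        "finishing": 8,
        "cleanup and delivery": 3,
    }
    materials = {
        "lumber": 600.0,
        "hardware and finish": 150.0,
    }

    labor_cost = HOURLY_RATE * sum(labor_hours.values())
    material_cost = sum(materials.values())
    subtotal = labor_cost + material_cost

    overhead = subtotal * 0.10   # electricity, gas, consumables
    profit = subtotal * 0.10     # profit margin
    price = subtotal + overhead + profit
    tax_reserve = price * 0.30   # buffer so the tax bill doesn't eat the job

    print(f"Labor: ${labor_cost:,.2f}  Materials: ${material_cost:,.2f}")
    print(f"Overhead: ${overhead:,.2f}  Profit: ${profit:,.2f}")
    print(f"Price to customer: ${price:,.2f} (reserve ${tax_reserve:,.2f} for taxes)")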

In my opinion, you HAVE to be systematic when pricing projects. Shooting from the hip works sometimes, but I would advise against it. For example, just doubling or tripling material cost will murder you on a very labor-intensive job with small amounts of wood. If you work through the entire process every time, you are much less likely to get burned.

I’d be happy to send my Excel spreadsheet to you for your use, but I don’t think we can attach things onto blog posts. No guarantees that it is a flawless financial document or anything, but it is a decent place to start. PM me if you are interested in that.

Hope you enjoyed the read. Comments and criticism welcome.


          Raid Schedule.. Please sign up!        
https://docs.google.com/spreadsheets/d/1OT6kr_MC8fkIH5jQKhwXPqRmQI-ZriVdOlBz1kEeYx0/edit?pref=2&pli=1#gid=0
          Packaging Engineer - Lactalis American Group - Buffalo, NY        
Microsoft Excel Spreadsheet software and Microsoft Word Processing software and Power Point software, Palletization software (TOPS or CAPE), AutoCAD....
From Lactalis American Group - Thu, 22 Jun 2017 21:02:49 GMT - View all Buffalo, NY jobs
          "Privacy is dead, get over it" [updated]        
I believe it was in 1999 that Scott McNealy famously said "privacy is dead, get over it". It is a whole lot deader now than it was then. A month ago in Researcher Privacy I discussed Sam Kome's CNI talk about the surveillance abilities of institutional network technology such as central wireless and access proxies. There's so much more to report on privacy that below the fold there can't be more than some suggested recent readings, as an update to my 6-month old post Open Access and Surveillance. [See a major update at the end]

There are four main types of entity motivated to violate your privacy:
  • Companies: who can monetize this information directly by selling it and indirectly by exploiting it in their internal business. Tim Wu's The Attention Merchants: The Epic Scramble to Get Inside Our Heads is a valuable overview of this process, as is Maciej Cegłowski's What Happens Next Will Amaze You.
  • Governments: both democratic and authoritarian governments at all levels from nations to cities are addicted to violating the privacy of citizens and non-citizens alike, ostensibly in order to "keep us safe", but in practice more to avoid loss of power. Parts of Wu's book cover this too, but at least since Snowden's revelations it has rarely been far from the headlines.
  • Criminals: who can be even more effective at monetizing your private information than companies.
  • Users: you are m